what(): std::bad_alloc

Hi everyone!
I'm writing a program that requires tensors.
I managed to make it work for a small test (small tensor size), but as soon as I made the tensor bigger I got the following when executing the program:


terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
Aborted


I've looked around the forum and the web, and the problem is that I'm running out of memory. However, the tensor size that generates the error is not that big (I think).

Here is the header of the program and the point at which it is aborted:

//#include losts of stuff

using namespace std;

//functions prototypes

int main(int argc, char * argv[]){

  int dim, diam;
  double *centrality;
  int **matr, **distances, **gij;
  double ***geo;
  string inname, outname; 

  if(argc == 5){
    dim = atoi(argv[1]);
    diam = atoi(argv[2]);
    inname = argv[3];
    outname = argv[4];
  }
  else{
    cout << "Problem with the line inputs, go pester someone else." << endl;
    return 1;
  }

  //Create the adjacency and distance matrices
  matr = new int*[dim];
  distances = new int*[dim];
  gij = new int*[dim];
  for(int i=0; i<dim; i++){
    matr[i] = new int[dim];
    distances[i] = new int[dim];
    gij[i] = new int[dim];
  }
  cout << "Matrices created.\n"; 

  //Tensor of booleans (dim X dim X dim) where geo[i][j][k] is True if node k is part of the geodesic between i and j
  geo = new double**[dim];
  for(int i=0; i<dim; i++){
    geo[i] = new double*[dim];
    for(int j=0; j < dim; j++){
      geo[i][j] = new double[dim];
    }
  }
  
  cout << "geo built.\n";

//After this I do more stuff, but the code breaks before the last cout


And here is the shell output after calling the program:

./a.out 1800 88 foo.dat foo.txt 
Matrices created.
terminate called after throwing an instance of 'std::bad_alloc'
  what():  std::bad_alloc
Aborted


So I guess the problem is when I try to allocate memory for an 1800x1800x1800 tensor.

Any ideas on how to fix this?

Thanks in advance!

If my calculations are correct, you are asking for about 43 GiB of memory with those numbers (1800^3 doubles at 8 bytes each is roughly 46.7 GB).

The only solution I can really give is either:
1) Don't go that high
or
2) Get more memory
I don't see any problem with your code, but 1800^3 is a huge number, so it wouldn't be that surprising if it ran out of memory. Why do you need such an enormous array?
Crud... no wonder it goes to hell.

I need such an enormous array for a research project I'm working on.
Basically, the issue is that I need to calculate the betweenness centrality of a network, and the "fast" methods (from standard libraries in R) give approximate answers that I'm having problems with (I think they are way off the real value). Therefore I decided to do an exact calculation, and I need the huge array for this.

Thanks for the quick reply... how can I get more memory? That is, without buying it; the hard drive where I run the sims has more than that.

Do you really need every element in a 3D array 1800 elements on a side? Or do you just want an array because it represents your data in a way that's easy to understand and manipulate? If the majority of the elements would be unused, you could make a pointer to an array that size and only allocate memory to the elements you use on a per-use basis.
@quirkyusername
You are right, I think I'm going to modify the structure so that I only keep relevant info.
Thanks for the suggestion!