Hi everyone!
I'm writing a program that requires tensors.
Managed to make it work for a small test (small tensor size), but as soon as I made the tensor size bigger I got the following when executing the program:
terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
Aborted
I've looked around the forum and the web, and the problem seems to be that I'm running out of memory. However, the tensor size that generates the error is not that big (I think).
Here is the header of the program and the point at which it is aborted:
//#include lots of stuff
using namespace std;
//function prototypes
int main(int argc, char * argv[]){
    int dim, diam;
    double *centrality;
    int **matr, **distances, **gij;
    double ***geo;
    string inname, outname;
    if(argc == 5){
        dim = int(atof(argv[1]));
        diam = int(atof(argv[2]));
        inname = argv[3];
        outname = argv[4];
    }
    else{
        cout << "Problem with the line inputs, go pester someone else." << endl;
        return 1;
    }
    //Create the adjacency and distance matrices
    matr = new int*[dim];
    distances = new int*[dim];
    gij = new int*[dim];
    for(int i=0; i<dim; i++){
        matr[i] = new int[dim];
        distances[i] = new int[dim];
        gij[i] = new int[dim];
    }
    cout << "Matrices created.\n";
    //Tensor of booleans (dim X dim X dim) where geo[i][j][k] is True if node k is part of the geodesic between i and j
    geo = new double**[dim];
    for(int i=0; i<dim; i++){
        geo[i] = new double*[dim];
        for(int j=0; j < dim; j++){
            geo[i][j] = new double[dim];
        }
    }
    cout << "geo built.\n";
    //After this I do more stuff, but the code breaks before the last cout
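As a side note (not the cause of the crash): the raw new[] loops above can be replaced with std::vector so the memory frees itself. This is just a sketch, and make_matrix is a made-up helper name, not something from the original code:

```cpp
#include <vector>

// Sketch: builds one dim x dim zero-filled matrix like the new[] loops
// in the post, but the returned vector cleans itself up, so no matching
// delete[] bookkeeping is needed.
std::vector<std::vector<int>> make_matrix(int dim) {
    return std::vector<std::vector<int>>(dim, std::vector<int>(dim, 0));
}
```

Usage would then be `auto matr = make_matrix(dim);` and so on for distances and gij. The memory footprint is essentially the same; the gain is safety, not size.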
And here is the shell output after calling the program:
./a.out 1800 88 foo.dat foo.txt
Matrices created.
terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
Aborted
So I guess the problem is when I try to allocate memory for a 1800x1800x1800 tensor.
I don't see any problem with your code, but 1800^3 is a huge number, so it wouldn't be that surprising if it ran out of memory. Why do you need such an enormous array?
I need such an enormous array for a research project I'm working on.
Basically, the issue is that I need to calculate the betweenness centrality of a network and the "fast" methods (from standard libraries in R) give approximate answers that I'm having problems with (I think they are way off the real value). Therefore I decided to do an exact calculation and I need the huge array for this.
Thanks for the quick reply... how can I get more memory? That is, without buying it; the hard drive where I run the sims has more than that.
Do you really need every element in a 3D array 1800 elements wide? Or do you just want an array because it represents your data in a way that's easy to understand and manipulate? If the majority of the elements would be unused, you could make a pointer to an array that size and only allocate memory for the elements you use, on a per-use basis.
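One way to act on that suggestion, given that the comment in the original code says geo only holds booleans anyway: store each (i, j) slice as a bit-packed std::vector&lt;bool&gt; and only materialize a slice the first time it is touched. This is only a sketch with made-up names (SparseGeo, at), not a drop-in replacement. Even a fully dense bit-packed version would need only dim^3 bits, roughly 0.7 GiB for dim = 1800 versus ~43 GiB of doubles, and lazy allocation can shrink that further:

```cpp
#include <cstddef>
#include <unordered_map>
#include <vector>

// Sketch: geo[i][j][k] as lazily-allocated, bit-packed boolean slices.
// Only the (i, j) slices that are actually touched ever get memory.
struct SparseGeo {
    std::size_t dim;
    std::unordered_map<std::size_t, std::vector<bool>> slices;

    explicit SparseGeo(std::size_t d) : dim(d) {}

    // geo[i][j][k]; the slice for (i, j) is created on first access
    std::vector<bool>::reference at(std::size_t i, std::size_t j, std::size_t k) {
        std::vector<bool>& s = slices[i * dim + j];
        if (s.empty()) s.assign(dim, false);   // allocate this slice lazily
        return s[k];
    }
};
```

Usage would look like `SparseGeo geo(1800); geo.at(0, 5, 7) = true;`, after which only the (0, 5) slice exists in memory. References into an unordered_map's mapped values stay valid across rehashing, so the returned bit reference is safe to use.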