class sample
{
public:
    static const unsigned int d = 5;
    static const unsigned long int M = 10000;
    double val[d][M];
};
In main, I created an array of objects of type sample.
int main(){
    const int N = 10;
    sample bm_sample[N];
}
The compilation ($ g++ main.cpp free_bndry.cpp) succeeds, but when I execute ./a.exe it gives me Segmentation fault (core dumped).
When I set const int N = 1; instead of 10, the Segmentation fault (core dumped) goes away.
My machine has 8 GB of memory, but Windows is using 70% of it.
What is the issue?
How much memory do I need to make M 1 million, and N 40?
Your bm_sample array will take about 4 MB on the stack (10 objects × 5 × 10000 doubles × 8 bytes each). Default stack size on MSVC and gcc from MinGW is 1 MB: stack overflow here.
You can:
1) Increase stack size (not recommended)
2) Create the objects on the heap like bradw proposed (recommended); see the sketch below this list
3) Learn why the heck Windows devours 6 GB of memory on its own.
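Since bradw's exact code is not quoted here, the following is only a minimal sketch of option 2, assuming the sample class from the original post; std::vector keeps the big storage on the heap so the objects themselves stay small:

#include <vector>

class sample
{
public:
    static const unsigned int d = 5;
    static const unsigned long int M = 10000;
    // The doubles live on the heap, owned by the vectors,
    // so sizeof(sample) is only the vector bookkeeping.
    std::vector<std::vector<double> > val;

    sample() : val(d, std::vector<double>(M)) {}
};

int main()
{
    const int N = 10;
    std::vector<sample> bm_sample(N);   // the N objects are heap-allocated too
}

With this layout the stack only holds a few small objects, and the roughly 400 KB of doubles per sample no longer count against the 1 MB stack limit.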
Thank you bradw and MiiNiPaa.
The program I am writing needs much larger arrays, let's say an array of 5*1,000,000*40. The challenge here is that I cannot delete them right away to free memory: at the start of the program I have to keep all the variables, but later on I can delete them little by little. Time also matters, so I cannot write them to a file and read them back according to my memory capacity.
I should be able to free up the memory Windows occupies so that about 6 GB is available. Is there any chance I can use at least 4 GB of it if I follow bradw's suggestion?
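A rough back-of-envelope check, assuming 8-byte doubles and ignoring allocator overhead: 40 objects × 5 × 1,000,000 doubles × 8 bytes = 1,600,000,000 bytes, i.e. about 1.6 GB. That fits within 4 GB of free memory, provided the data is allocated on the heap rather than the stack.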
Thank you for all your help, guys.
To conclude:
I terminated as many applications as possible in Windows to free 6 GB of memory. Then I changed the declaration inside my class to
class sample
{
public:
    static const unsigned int d = 5;
    static const unsigned long int M = 10000000;
    double** val;

    sample()
    {
        val = new double*[d];              // array of d row pointers
        for (unsigned int i = 0; i < d; ++i)
            val[i] = new double[M];        // each row holds M doubles on the heap
    }
};
and my program to
int main(){
    static const unsigned int d = 5;            // local copies, not used below
    static const unsigned long int M = 100000;
    const int N = 10;
    sample* bm_sample = new sample[N];          // N sample objects on the heap
}
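One caveat on this conclusion: everything allocated with new[] should eventually be released with a matching delete[], and since the goal was to delete the data little by little once it is no longer needed, something along these lines would work. The helper names free_row and free_all are only illustrative, assuming the sample class above:

// Release one row of a sample as soon as it is no longer needed.
void free_row(sample& s, unsigned int i)
{
    delete[] s.val[i];      // give back the M doubles of row i
    s.val[i] = nullptr;     // mark the row as freed
}

// Release whatever rows are left, plus the row-pointer array itself.
void free_all(sample& s)
{
    for (unsigned int i = 0; i < sample::d; ++i)
        delete[] s.val[i];  // delete[] on a null pointer is a safe no-op
    delete[] s.val;
    s.val = nullptr;
}

The bm_sample array itself also needs a matching delete[] bm_sample; once all the objects are done.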