What exactly is the problem? Once initialized, a dynamic array works exactly the same as a static one.
When you loop over the rows and new the columns, you can immediately loop over each column and assign values. At the next row you can do the same, basing its values on the previous row's items (since those have already been assigned).
This is one time where defining the variable outside of main's scope might solve your problem. Variables defined outside of main (or any other function) have static storage duration: they live in the program's data segment, not on the stack or the heap.
namespace big {
    // Define this HUGE variable outside of main,
    // but in a namespace to avoid naming conflicts.
    double arr[2000000][300];
}

int main() {
    // do stuff with big::arr
}
If you dynamically allocate the array with new, it will be stored on the heap instead. I'm not exactly sure, but I think keeping all the data on the heap might slow your program down.
First, why do you want to store such a huge number of values (not a huge value)? If you need to, persist the data in a database or even in a flat file, so that you are using disk storage instead of memory. You will always run into problems when you try to allocate such a large chunk of memory.
Do not declare it outside of main, as your program's memory usage gets too high and people won't accept it. The stack is the best option. But store the numbers (doubles) in a text file and process them from there.
In place of your original size, use [2000][300], and do the processing in 1000 batches (use and release).
I am calculating a temperature decay model inside a spark. The time step of the model is 1e-13 s, and I need to model a time range of 200e-6 s. That's why I need such a large array...
Your suggestion is interesting. Although I can't understand all of it right now, I will go through the details. Can you please explain a little more about what you said in your last reply? Sorry, I am just a beginner in C++. I really appreciate your help!
What he means [as I interpret it]: do you really need all 2000000x300 numbers simultaneously? If not, you could do batch processing. Read in 2000x300 numbers, process them, output the results, and discard them. Then move on to the next 2000x300 numbers and do the same. Repeat 998 more times.
Since you don't seem to be doing anything in real time, this should let you drastically reduce memory requirements (to 0.1% of your original plan) while keeping some economies of scale (2000x300 should be enough to limit the impact of any per-batch overhead), without doing any damage to the simulation itself (you merely pause every 2000 intervals).
If you want more specific help or hints, you'll have to explain what exactly you're trying to do. I have no clue which calculations are being done, or what data is needed at which phase.
If he managed to allocate the array without the program crashing, I doubt doing the paging manually instead of letting the OS do it will make much of a difference performance-wise. If the program behaves (or can be made to behave) in a batch-like fashion, manually swapping data in and out will make no difference. If the program needs to access the data at random, then paging will definitely not make any difference.
Bottom line: if your program is running slowly at this point, either reorder your operations so you only need to access a limited range of the data, or buy more RAM. Or maybe the bottleneck is the algorithm and not the memory.
+helios, I was going to say that... isn't the OS just paging to secondary memory (i.e. the hard disk) at this point anyway? I mean, this isn't the first program to need >4GB of memory and pretend it's all in RAM at once.
sirama wrote:
Do not declare [it] outside of main, as your program's memory usage gets too high and people won't accept it. The stack is the best option.
Mathhead20 wrote:
This is one time where defining the variable outside of main's scope might solve your problem.
I was trying to explain that normally I would say: put everything in some local scope. But let's be honest, a >4GB stack... really? How would that be at all efficient? I feel "people" will understand that this was an exceptional case. (Feel free to elaborate on what you meant if you disagree; I've never personally done something like this in a (non-test/WTF) program before.)