Hi, I've been writing a program to analyse some data from a particle counter. Basically, the program just reads the data in and puts it into Excel. My issue is that the program is crashing. I'm fairly sure the logic itself is written properly, and I know what the problem is: when the program is run with half the amount of data, it works fine. The problem is the amount of data I'm shoving into the program. I was wondering if there is some way to give my program access to more RAM? Unless anyone notices something else I did wrong. Thanks ahead of time.
I believe Excel has a limit on the number of rows and columns, correct? (The old .xls format tops out at 65,536 rows by 256 columns; the newer .xlsx format allows 1,048,576 rows by 16,384 columns.) I also assume you are running your program on a desktop PC with Excel installed?
In general, based on your requirements, such data-intensive work should be delegated to a back-end server for processing; once the results are ready, you can find a way to get the output back to your desktop PC.
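To give a rough idea of what I mean, here is a minimal sketch, assuming a hypothetical analysis server already listening at data-server.local on port 9000 (both names made up for illustration). The desktop client just ships the raw counter data over a TCP socket and prints whatever summary the server sends back. It uses POSIX sockets, so it compiles on Linux/macOS as-is; Windows would need Winsock instead.

#include <netdb.h>
#include <sys/socket.h>
#include <unistd.h>
#include <iostream>
#include <string>

int main() {
    // Resolve the (hypothetical) back-end server's address.
    addrinfo hints{}, *res = nullptr;
    hints.ai_family = AF_INET;
    hints.ai_socktype = SOCK_STREAM;
    if (getaddrinfo("data-server.local", "9000", &hints, &res) != 0) {
        std::cerr << "could not resolve server\n";
        return 1;
    }

    // Open a TCP connection to it.
    int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) != 0) {
        std::cerr << "could not connect\n";
        freeaddrinfo(res);
        return 1;
    }
    freeaddrinfo(res);

    // Ship the raw counter data; the server does the heavy analysis.
    std::string payload = "12,34,56,78\n";   // stand-in for the real readings
    send(fd, payload.data(), payload.size(), 0);
    shutdown(fd, SHUT_WR);                   // signal end of input

    // Print whatever summary the server sends back.
    char buf[4096];
    ssize_t n;
    while ((n = recv(fd, buf, sizeof buf, 0)) > 0)
        std::cout.write(buf, n);
    close(fd);
    return 0;
}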
If offloading is not a viable approach, I am afraid you may need to change the existing program to detect boundaries. For example, once Excel reaches the 65K-row limit, all new data coming in either has to wrap around, or you remove the first line and append the new data at the end (see the sketch below).
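Here is a rough sketch of the "remove the first line" variant, assuming the classic 65,536-row .xls limit and output to a CSV file (which Excel opens directly). RollingCsvWriter and the file name are made up for illustration: it keeps at most the last 65,536 rows, discarding the oldest each time a new one arrives at the limit, then dumps the window to disk.

#include <cstddef>
#include <deque>
#include <fstream>
#include <string>

const std::size_t MAX_ROWS = 65536;  // classic .xls worksheet row limit

// Keeps a sliding window of at most MAX_ROWS rows: at the limit,
// the oldest row is removed before the new one is appended.
class RollingCsvWriter {
public:
    void addRow(const std::string& csvLine) {
        if (rows_.size() == MAX_ROWS)
            rows_.pop_front();           // remove first line
        rows_.push_back(csvLine);        // append new data at the end
    }

    // Write the current window out as CSV for Excel to open.
    bool flush(const std::string& path) const {
        std::ofstream out(path);
        if (!out) return false;
        for (const std::string& row : rows_)
            out << row << '\n';
        return true;
    }

private:
    std::deque<std::string> rows_;
};

int main() {
    RollingCsvWriter writer;
    for (int i = 0; i < 200000; ++i)     // more rows than Excel can hold
        writer.addRow(std::to_string(i) + ",count");
    return writer.flush("particles.csv") ? 0 : 1;
}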
Thanks for the advice. I ended up learning basic VBA and writing a macro to analyse the data. However, I'm still interested in what you said:
'In general, based on your requirements, such data-intensive work should be delegated to a back-end server for processing; once the results are ready, you can find a way to get the output back to your desktop PC.'
I'm still a beginner programmer, so I'm not entirely sure how to go about doing this. I would love to rewrite this C++ program as practice. Could you explain what exactly 'back-end servers' are and how to use them?