Actually it is my program failing at runtime with an "Access violation at address" error. In other instances it throws a bad_alloc exception instead.
When I pass reserve() a more conservative number, say 10000, there is no exception, but if I then exceed that capacity the library only allocates one byte at a time, and that is a major performance issue.
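If it helps to pin down what your library is doing, here is a minimal sketch (plain standard C++, nothing assumed beyond <string> and <iostream>) that reserves a small capacity and then appends past it, printing capacity() at every reallocation so you can see exactly how the buffer grows once the reserve is exceeded.

#include <iostream>
#include <string>

int main()
{
    std::string s;
    s.reserve(10000);                        // conservative reserve, as above
    std::cout << "initial capacity: " << s.capacity() << '\n';

    std::string::size_type last = s.capacity();
    for (int i = 0; i < 20000; ++i)
    {
        s.push_back('A');
        if (s.capacity() != last)            // report every reallocation
        {
            last = s.capacity();
            std::cout << "grew to " << last
                      << " at size " << s.size() << '\n';
        }
    }
    return 0;
}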
I have been testing to find the maximum number I can pass to the reserve function. I had the value narrowed down to 2^32-148897810, but then it changed on me immediately, without my changing any other settings. This sounds like a memory problem, meaning that I don't have enough memory available on my computer at that moment to allocate that much. Does that sound right?
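Rather than narrowing the number down by hand, you could binary-search for the largest reserve() that succeeds. This is only a sketch (can_reserve is a helper I made up for it), and the answer will drift from run to run because it depends on how much contiguous address space happens to be free at that moment.

#include <iostream>
#include <new>
#include <string>

// Returns true if a string of the given capacity can be reserved right now.
bool can_reserve(std::string::size_type n)
{
    try
    {
        std::string s;
        s.reserve(n);
        return true;
    }
    catch (const std::bad_alloc&)
    {
        return false;
    }
}

int main()
{
    std::string::size_type lo = 0, hi = std::string().max_size();
    while (lo < hi)                           // binary search for the threshold
    {
        std::string::size_type mid = lo + (hi - lo + 1) / 2;
        if (can_reserve(mid))
            lo = mid;
        else
            hi = mid - 1;
    }
    std::cout << "largest reserve that succeeded: " << lo << '\n';
    return 0;
}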
OK, I think I was right. The program was using the maximum amount of virtual memory available: the string's reserved size was near 1.856 GB, which is close to the size of my pagefile. I will try increasing my pagefile size...
Well, changing my pagefile size to ~3 GB did not change the limit of virtual memory that my program can access. The program is using almost 2 GB, but it won't go any higher.
But if you are using a 32-bit OS, then increasing the pagefile won't help, because you only get ~4 GB of virtual address space per process no matter how much RAM/swap you have.
But eker676's question is valid -- do you really need to reserve that much? Isn't there some smaller reasonable limit you can use?
Thanks for getting back so quickly. As for whether I really need to reserve that much for a string: yes. I am writing an application that deals with DNA sequence information, and the files can get very large.
@jsmith
So if I get 4 GB for a process, why am I being clamped at ~2 GB? Does this have to do with my implementation?
Thanks for the link. It looks like in my situation (32-bit Windows XP) I can get only 3 GB (with the IMAGE_FILE_LARGE_ADDRESS_AWARE flag set) and 2 GB normally. That won't cut it.
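If you want to confirm those numbers from inside the process, Windows exposes them through GlobalMemoryStatusEx. A minimal sketch follows (depending on your headers you may need _WIN32_WINNT defined as 0x0501 or later for the declaration); on a default 32-bit build ullTotalVirtual should come back as roughly 2 GB.

#include <windows.h>
#include <iostream>

int main()
{
    MEMORYSTATUSEX status;
    status.dwLength = sizeof(status);        // the API requires this to be set first

    if (GlobalMemoryStatusEx(&status))
    {
        std::cout << "total virtual address space:     "
                  << status.ullTotalVirtual / (1024 * 1024) << " MB\n";
        std::cout << "available virtual address space: "
                  << status.ullAvailVirtual / (1024 * 1024) << " MB\n";
        std::cout << "commit limit (roughly RAM + pagefile): "
                  << status.ullTotalPageFile / (1024 * 1024) << " MB\n";
    }
    else
    {
        std::cerr << "GlobalMemoryStatusEx failed: " << GetLastError() << "\n";
    }
    return 0;
}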
Right now I have been running tests on an 8 GB file (two 4 GB sections). With my current code, a whole 4 GB section is read into a string to be processed.
I think my best solution is to rewrite my code to process these large sections in chunks. That would cut down my program's memory requirements when processing these large files. I was hoping there was an easier way, but in the end this will probably be better.
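For what it's worth, here is a rough sketch of the chunked approach; the 64 MB chunk size, the file name, and the process_chunk function are all placeholders, but the idea is to stream the file through a fixed-size buffer so the memory footprint stays bounded no matter how big the sequence file gets.

#include <cstddef>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

// Placeholder for whatever processing each piece of the sequence needs.
void process_chunk(const std::string& chunk)
{
    std::cout << "processed " << chunk.size() << " bytes\n";
}

int main()
{
    const std::size_t chunk_size = 64 * 1024 * 1024;     // 64 MB, tune to taste

    std::ifstream in("sequence.dat", std::ios::binary);  // placeholder file name
    if (!in)
    {
        std::cerr << "could not open file\n";
        return 1;
    }

    std::vector<char> buffer(chunk_size);
    while (in)
    {
        in.read(&buffer[0], static_cast<std::streamsize>(buffer.size()));
        std::streamsize got = in.gcount();
        if (got > 0)
            process_chunk(std::string(&buffer[0], static_cast<std::size_t>(got)));
    }
    return 0;
}

One thing to watch with this layout is that a chunk boundary can land in the middle of a record, so you may need to carry the tail of one chunk over into the next before processing it.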