I'm trying to understand memory leaks better. Specifically, a program of mine (a numerical simulation) seems to have a small memory leak even though I avoid raw pointers as much as possible. The thing I'm confused about is that the free system memory decreases as the program runs (the leak), but doesn't bounce back up after the program terminates. I thought that even if there was a leak, program termination would release the memory back to the operating system. What am I missing here?
No, if you allocate memory yourself with new, it is not available to the operating system until delete has been called on the pointer. So if you ran your program enough times (although it would probably take a great many runs), your computer would run out of available memory and crash.
It is available after the program terminates on every operating system I have ever used. This is usually a case of misreading or misinterpreting what the OS is telling you about free memory.
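If you want to convince yourself, a deliberately leaky throwaway program like this one (sizes picked arbitrarily) makes it easy to watch: run it, check free memory while it waits for Enter, then check again after it exits, and you will see the memory come straight back.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        /* Leak roughly 1 GiB on purpose: allocate 1 MiB blocks, touch the
           pages so they are really committed, and never free them. */
        for (int i = 0; i < 1024; i++) {
            char *p = malloc(1024 * 1024);
            if (!p)
                break;
            memset(p, 1, 1024 * 1024);
        }

        puts("leaked the memory; check free memory now, then press Enter");
        getchar();      /* after this returns, the process exits and the OS
                           reclaims everything, leak or no leak */
        return 0;
    }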
Hmm, I must be doing something really evil then, since it definitely is not freeing the memory. The big leak that (I think) I fixed was pretty nasty. In the simulation I dump the current system state to a file frequently, via a statement like
fprintf(f, "%Lf, %Lf", a, b);
Now if a and b are not actually long doubles, e.g. they are regular doubles instead, it leaks memory! Most evil problem ever. Whoever heard of a memory leak in a print statement?
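For anyone who trips over this later, here is a minimal sketch of the mismatch and the fix (the variable names are made up). The point is just that the format specifier has to match the actual argument type: %Lf means long double, %f (or %lf) means double, and passing the wrong one is undefined behaviour, so anything can happen, including what looks like a leak.

    #include <stdio.h>

    int main(void)
    {
        double a = 1.5, b = 2.5;          /* plain doubles, not long doubles */

        /* Wrong: %Lf tells fprintf to read a long double from the argument
           list, but a and b are passed as plain doubles; the type mismatch
           is undefined behaviour. */
        /* fprintf(stdout, "%Lf, %Lf\n", a, b); */

        /* Right: %f (or %lf, equivalent in the printf family) matches double. */
        fprintf(stdout, "%f, %f\n", a, b);

        /* If the variables really are long double, declare them so and keep %Lf. */
        long double la = 1.5L, lb = 2.5L;
        fprintf(stdout, "%Lf, %Lf\n", la, lb);

        return 0;
    }

Compiling with warnings enabled (gcc or clang with -Wall, which includes -Wformat) will usually flag exactly this kind of mismatch.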
Look up varargs (the ellipsis) and all the problems associated with them. Learn them, then avoid them whenever you can. Well, unless you actually need them (say, you are on a project that uses functions built on them), but then at least you will know the pitfalls.
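To make the pitfall concrete, here is a tiny made-up example of why varargs can never be type-checked: va_arg just reinterprets whatever the caller passed as the type you name, so a wrong argument type (exactly like the %Lf versus double case above) is undefined behaviour.

    #include <stdarg.h>
    #include <stdio.h>

    /* Hypothetical helper: sums 'count' double arguments. The compiler cannot
       check that the caller really passes doubles; va_arg just reinterprets
       whatever was passed as the type you name. */
    static double sum_doubles(int count, ...)
    {
        va_list ap;
        double total = 0.0;

        va_start(ap, count);
        for (int i = 0; i < count; i++)
            total += va_arg(ap, double);   /* wrong argument type here = UB */
        va_end(ap);

        return total;
    }

    int main(void)
    {
        printf("%f\n", sum_doubles(2, 1.0, 2.0));    /* fine: prints 3.000000 */
        /* printf("%f\n", sum_doubles(2, 1, 2)); */  /* ints, not doubles: UB */
        return 0;
    }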