Every compiler has its idiosyncrasies and variations from the standards. Some are bugs and some are deliberate.
I'm using Dev-C++ as a compiler, as that's the only one someone recommended to me a few years ago... The name seems to suggest it is a C++ compiler? |
I've never used it, but lots of people do. I would say the environment
does use a C++ compiler. Just because it has the odd deviation from the standard doesn't make it something else.
I tried some numbers for p. 500,000 seemed to work (I had just started my OS, and so there were quite a lot of 0's printed. Was quite funny :P ). With 5,000,000 the program crashed. I couldn't find anything like a typical stack size (Windows XP), but I found some talk about '64 KB of size'. I think it could have been that number? |
I guess you can run out of stack. The compiler doesn't know how large the stack is. It's set by the linker. On Windows, the stack grows dynamically. You can trap a stack fault when the stack is being grown if you want to, but there's usually no point in doing so.
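For what it's worth, here's a guess at the kind of test you ran (a sketch only, since I haven't seen your code). It assumes p was the size of a local array of ints; the heap version is the usual workaround:

#include <cstdio>
#include <cstdlib>

int main()
{
    const int p = 5000000;

    // int a[p];                   // automatic: roughly 20 MB on the stack, crashes

    int* a = (int*)malloc(p * sizeof(int));   // dynamic: same size, taken from the heap
    if (a == NULL)
    {
        printf("allocation failed\n");
        return 1;
    }
    for (int i = 0; i < p; ++i)
        a[i] = 0;
    printf("%d\n", a[0]);          // one of those 0's you saw
    free(a);
    return 0;
}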
Anyway, let's see if I get it: normal variables are allocated from the stack, some space that's reserved for the program. Dynamic-memory variables are allocated from the heap, some space that's not primarily reserved for the program but can be if asked for.
Right? |
Automatic variables are allocated on the stack. In the B programming language, you had to use the auto keyword to declare a stack variable; it's optional in C, and the latest C++ standard uses the keyword to mean something else.
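A quick illustration (the last line assumes a compiler that already supports the newer meaning of auto):

void example()
{
    int n = 42;         // automatic: storage on the stack, released when the function returns
    // In B (and old C) you could spell the storage class out:  auto int n = 42;
    // In C++11, auto instead means "deduce the type":
    auto x = 3.14;      // x is a double
}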
There's a separate area for global variables and static data. And there's a read-only area for string literals. So far, all of these are static allocations, and all this memory has to be available before the program can be loaded for execution.
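Roughly where each kind of object ends up, as a small sketch:

const char* greeting = "hello";   // the characters of "hello" live in the read-only area
int counter;                      // global: separate data area, zeroed at load time
static int calls = 0;             // static: same data area, internal linkage

void bump()
{
    int local = 1;                // automatic: stack
    calls += local;
    int* dyn = new int(counter);  // dynamic: heap (covered below)
    delete dyn;
}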
As you add more RAM to the computer, more memory is available to programs. In C and C++ this is available in the heap. The idea is to ask for varying amounts of memory and then return it when done. How much you can ask for depends on the size of the largest contiguous block in the heap; the allocation is either satisfied or rejected (you get NULL back).
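A minimal sketch of asking the heap for memory and handing it back, using malloc/free so the failure case is the NULL return described above (plain C++ new throws std::bad_alloc on failure instead, unless you use new (std::nothrow)):

#include <cstdio>
#include <cstdlib>

int main()
{
    size_t count = 1000000;                          // hypothetical request
    double* data = (double*)malloc(count * sizeof(double));
    if (data == NULL)                                // no contiguous block large enough
    {
        printf("out of memory\n");
        return 1;
    }
    // ... use data ...
    free(data);                                      // give it back when done
    return 0;
}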
And the stack size is the amount of memory you have available to your program for your normal variables? And this is a standard amount? So whenever you're writing a program that's going to use more memory than the stack size, you'll have to turn to dynamic memory? Or can you alter the stack size in some way? |
The stack is a small amount of memory set aside by the linker. It's a static size and must be available before the program can be loaded. It isn't really standard, as you can tell linkers how large to make the stack, but it's fixed to begin with. Some larger systems like Windows can grow the stack dynamically, but as you've found, there's still a limit to how far it can grow, even though it can get ridiculously large. Automatic variables are placed on the stack. There's also alloca, which can make an allocation from the stack; such an allocation is released when the function exits and the stack is popped. It's this alloca that's used by C99 to create dynamic arrays on the stack, and this is the behaviour that GCC allows in their C++ compiler by default. But we've provided links twice previously on how to make GCC warn about deviations from the standard.
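To make that concrete, here's a C99-style array that g++ accepts as an extension when compiling C++; adding -pedantic is one way to get a warning about it (the exact message varies by GCC version):

// vla.cpp
#include <cstdio>

int main()
{
    int p;
    if (scanf("%d", &p) != 1 || p <= 0)
        return 1;

    int a[p];               // variable-length array on the stack,
                            // a GCC extension when compiled as C++
    for (int i = 0; i < p; ++i)
        a[i] = 0;
    printf("%d\n", a[p - 1]);
    return 0;
}
// g++ -pedantic vla.cpp    warns that ISO C++ forbids a variable length array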
You don't turn on dynamic memory; the heap is always available to a C/C++ program. You don't grow the stack; if the environment supports it, it grows it for you. If the environment cannot grow the stack, your program will crash at run-time.