new doesn't return NULL/nullptr on a failed allocation; it throws std::bad_alloc. To make new return NULL/nullptr instead, use the non-throwing form: new( std::nothrow ) int[ 10 ].
Also, because you're using the throwing form, your NULL check can never succeed: if the allocation fails, std::bad_alloc is thrown before your check is ever reached, so your throw 1 is redundant. Either way your catch( ... ) ends up handling the exception, since a catch-all catches std::bad_alloc just as readily as the int you throw yourself and you haven't provided a specific handler for either.
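To illustrate the difference, here's a small sketch of the two forms side by side (the sizes are arbitrary):

#include <iostream>
#include <new> // std::nothrow, std::bad_alloc

int main()
{
    // Throwing form: never returns NULL; throws std::bad_alloc on failure.
    try
    {
        int *p = new int[ 10 ];
        delete[] p;
    }
    catch( const std::bad_alloc &e )
    {
        std::cout << "new threw: " << e.what() << std::endl;
    }

    // Non-throwing form: returns nullptr on failure instead of throwing.
    int *q = new( std::nothrow ) int[ 10 ];
    if( q == nullptr )
        std::cout << "new( std::nothrow ) returned nullptr" << std::endl;
    else
        delete[] q;

    return 0;
}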
Where's your delete[] call? You allocate memory on every pass of the loop (which runs forever as long as no exception is thrown), but you never call delete[] on anything you allocated, so you leak it all. Really, your code should look more like this:
int *MyNum( nullptr );
while( true )
{
    try
    {
        MyNum = new int[ 10 ]; // Or whatever size you need.
    }
    catch( const std::bad_alloc & )
    {
        std::cout << "Bad allocation" << std::endl;
        // No need to call delete[] since the allocation failed.
        // Loop again and retry the allocation.
        continue;
    }
    // The allocation succeeded. No need to loop again.
    break;
}
// Continue with main( ), and remember to delete[] MyNum once you're done with it.
@hamsterman it just means the system has no free memory, so everything gets bogged down and I can't do anything until my program crashes, which frees the memory up again so I can do stuff.
@Framework well, I tried that too, but the exception was never thrown.
And I don't delete mynum because I am trying to force the exception to be thrown.
I was just trying to test the darn thing. I'm learning exceptions now.
Why wouldn't I want to cause a bad_alloc?
And when I run out of memory, shouldn't that cause the std::bad_alloc to be thrown when I try to allocate more memory?
Let me rephrase, or clarify.
I don't actually know that the exception is not being caught, but "Allocation failed" is never printed to my screen. That could be because, like I said, my system is crashing and there aren't enough system resources left to cout even one line, but I certainly ran out of memory.
I remember hearing that not all systems will throw a bad_alloc (or give any other measurable error condition) on failure to allocate memory. Perhaps this is one of those instances.
I don't remember where I heard that or how reliable it is, though.
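For what it's worth, if the goal is just to see std::bad_alloc get caught without grinding the whole machine to a halt, one option is to ask for a single block that no implementation can realistically satisfy. A rough sketch (the size is deliberately absurd, and exact behaviour varies by platform):

#include <cstddef>
#include <iostream>
#include <limits>
#include <new>

int main()
{
    try
    {
        // Request close to the maximum representable size in one go.
        // On typical implementations this fails immediately and throws
        // std::bad_alloc instead of slowly exhausting physical memory.
        char *huge = new char[ std::numeric_limits<std::size_t>::max() / 2 ];
        delete[] huge; // Only reached if the allocation somehow succeeds.
    }
    catch( const std::bad_alloc &e )
    {
        std::cout << "Allocation failed: " << e.what() << std::endl;
    }
    return 0;
}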
Well, what happens does appear to be strange. When my computer freezes (on executing the 'memory fork bomb'), I'll try to close my program immediately; after about 30 seconds it closes, and then it takes another 30 seconds to a minute before my computer begins to behave normally. In Task Manager I see my memory go from maxed out at 1.99GB down to near 0, then slowly grow back up to about 450MB, which is where it was before I started my program. And I'm on Windows 7, so maybe that's it.
I'm wondering if the strange behaviour is because you are allocating loads and loads of tiny little blocks of memory when the constructors of the objects in your original array of bigints are called.
Are you still allocating just a single extra int for each bigint?