catch (IOException& ex) {
    std::cout << ex.what() << '\n';
    delete[] file; // wrong
    return false;
}
Correct me if I'm wrong: file is either NULL or invalid (I assume NULL).
If the function throws an exception, it does not return a value to be assigned to the 'file' variable, so you've got a memory leak.
More accurately, if the function throws an exception then the assignment to 'file' never happens. Fortunately, it's initialized to nullptr so the delete[] is safe.
For the same reason, the original post's code may have undefined behavior, depending on the value of p coming into the code:
try {
    p = new int[4*1000000000];
    // ok, this definitely will make my program crash.
} catch (const bad_alloc& ex) {
    delete[] p; // p was not assigned
    cout << "Error: " << ex.what() << endl;
}
Is it a good programming practice?
Probably not. When you're out of memory, you have to assume that any future attempt to allocate memory will also fail. I realize this isn't true for the example you gave, but in real code it is. So
cout << "Error: "
might fail (it might try to allocate memory for buffering), and
ex.what()
might fail for the same reason.
The only way I've seen to gracefully handle out-of-memory conditions in anything but trivial programs is this:
- allocate some amount of "emergency memory" at the beginning of the program.
- If you run out of memory, the first thing you do is free the emergency memory. This ensures that some future allocations will work.
- Save the user's data and exit.