Well, you might be wasting memory during the execution of your program. If it's a game, for example, that has to allocate lots and lots of memory for meshes, textures and maps, then you need to be as careful as possible.
Consider this:
std::vector<sf::Image*> images;
for (unsigned i = 0; i < 1000; i++)
{
    std::ostringstream oss;   // needs <sstream>
    oss << i;
    sf::Image* img = new sf::Image;
    img->LoadFromFile(oss.str() + ".png");
    // forget to add the image to the vector:
    // images.push_back(img);
}
Because the line that adds each image to the vector is commented out, the address of each image is lost when the local variable sf::Image* img goes out of scope at the end of each iteration of the loop, so the allocated image can never be deleted.
Now, if each image is about 1MB, that's roughly 1GB of wasted memory, some of which could have been freed long before the application quits (e.g. when the player starts a new level).
I know this example is a little contrived, since once the images' addresses are lost they can't be used for anything at all, but it illustrates how easily you can burn through a large amount of memory.
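For comparison, here is a sketch of the same loop without the leak: each pointer is stored in the vector and deleted manually once it is no longer needed (e.g. when the level is unloaded):
std::vector<sf::Image*> images;
for (unsigned i = 0; i < 1000; i++)
{
    std::ostringstream oss;
    oss << i;
    sf::Image* img = new sf::Image;
    img->LoadFromFile(oss.str() + ".png");
    images.push_back(img);   // keep the address so the image can be freed later
}

// ... later, e.g. when the level is unloaded:
for (unsigned i = 0; i < images.size(); i++)
    delete images[i];
images.clear();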
This is just my perspective on the situation; others may be able to give a more complete picture.
If you want an easy solution to this problem, you can tie each pointer to an object with automatic storage duration (i.e. a local variable that owns it). The memory is then deallocated automatically when the object goes out of scope, which also makes the code exception safe.
Look up "RAII" and "smart pointers".
But a simple example would be this:
class ImagePtr {
    sf::Image* img;
public:
    ImagePtr(const std::string& filename)
    {
        img = new sf::Image;
        img->LoadFromFile(filename);
    }
    ~ImagePtr()
    {
        delete img;
    }
    // expose the image so the wrapper can be used like a pointer
    sf::Image* operator->() const { return img; }
};
// etc
ImagePtr img("bob.jpg");
// it's essentially a pointer, but the deallocation of the memory is handled automatically.
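One thing to be aware of: copying an ImagePtr as written would delete the same image twice, which is exactly the kind of problem real smart pointers solve for you. If your compiler supports C++11, a minimal sketch of the first example using std::unique_ptr instead of raw pointers might look like this (the loop and filenames are just the placeholders from above):
// std::unique_ptr is declared in <memory>
std::vector<std::unique_ptr<sf::Image>> images;
for (unsigned i = 0; i < 1000; i++)
{
    std::ostringstream oss;
    oss << i;
    std::unique_ptr<sf::Image> img(new sf::Image);
    img->LoadFromFile(oss.str() + ".png");
    images.push_back(std::move(img));   // the vector now owns the image
}
// every image is deleted automatically when 'images' goes out of scope,
// even if an exception is thrown somewhere along the way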