I checked the implementation of realloc() in glibc and it does, in fact, prefer to grow the current block in place, eschewing the copy step, over allocating a new block elsewhere and having to do the copy. I suppose in a trivial program where the heap state can practically be known, this would be an optimization. In practice, since C++ is used to build simple 5-line programs and 10,000,000-line word processors alike, I doubt that it would provide much benefit, due to the dynamic nature of the heap.
But, really, if you feel that new/delete are bad, then, rocketboy, you need to forget C++ and stick to C.
new and delete are good. new[] and delete[] are bad. If there's no chance of renew[] existing, why not make std::vector (and std::string) built into the language, allowing massive optimization?
Growing the current block is always a good optimization, because if I have a huge vector of std::string that I'm handling (deleting, resizing, concatenating), it can make things much faster.
why not make std::vector (and std::string) built in to the language allowing massive optimization?
You could make that statement about virtually every library for every programming language.
Yes, I agree that growing the current block is a good optimization. It's just that I doubt it is an optimization that can be employed frequently enough to make it worthwhile. std::vector<> already provides a reserve() method which is intended to be used to reduce the number of reallocations needed. It works great if you can put a reasonable upper bound on the number of elements the vector will contain. If you can't then there is std::deque<>, which gives you about the same runtime efficiency as vector<> but avoids the reallocations. It works great as a replacement for std::vector<> when you can't put a reasonable bound on the number of elements.