So a virtual function entails a function pointer lookup every time said function is called.
I generally go by the rule "it's not a performance problem until it is", but if I had to call an overridden virtual function a million times, would a non-virtual alternative make a significant performance difference?
I suppose I could make a test program for this. Just thought I'd ask in case anyone has prior experience here.
The overhead of the vtable lookup is just a couple of machine instructions per call.
If the extra overhead were only those two instructions per call, there would be no perceptible performance difference between making every call in a non-trivial program virtual and making them all non-virtual. In fact, virtual dispatch is often faster than the obvious hand-rolled alternative, a switch on a type tag.
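To make the switch-case comparison concrete, here is a hedged sketch (the shape types are invented for illustration). The switch pays a chain of comparisons at every call site and must be edited whenever a new type is added; the virtual call is a single indirect jump regardless of how many types exist.

```cpp
#include <cassert>

// Hand-rolled dispatch on a type tag: every call site branches on the tag,
// and adding a new shape means touching every such switch.
enum class Shape { Circle, Square };

double area_switch(Shape s, double d) {
    switch (s) {
        case Shape::Circle: return 3.14159265358979 * d * d / 4.0;
        case Shape::Square: return d * d;
    }
    return 0.0;
}

// Virtual dispatch: one indirect call through the vtable, constant cost,
// and new shapes are added without modifying existing call sites.
struct ShapeV {
    virtual ~ShapeV() = default;
    virtual double area(double d) const = 0;
};
struct CircleV : ShapeV {
    double area(double d) const override { return 3.14159265358979 * d * d / 4.0; }
};
struct SquareV : ShapeV {
    double area(double d) const override { return d * d; }
};
```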
The major performance overhead of virtual functions stems from the fact that the function to be called can't be determined at compile time. A non-virtual function is inlinable; a virtual function is not. If the function body is small, the difference can be significant (for example, std::sort() can be an order of magnitude faster than std::qsort() for typical comparison predicates). Even if the function is not actually inlined, the compiler can still optimize across the call as long as it can see the definition, because it knows exactly what the function does.
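The std::sort() vs std::qsort() point can be sketched like this (function names are mine, for illustration): qsort receives its comparator as a function pointer, which the compiler cannot see through, while std::sort receives the comparator's concrete type and can inline the comparison into the sorting loop.

```cpp
#include <algorithm>
#include <cstdlib>
#include <vector>

// qsort calls the comparator through a function pointer on every
// comparison; the call cannot be inlined.
int cmp_int(const void* a, const void* b) {
    int x = *static_cast<const int*>(a);
    int y = *static_cast<const int*>(b);
    return (x > y) - (x < y);
}

void sort_qsort(std::vector<int>& v) {
    std::qsort(v.data(), v.size(), sizeof(int), cmp_int);
}

// std::sort is instantiated with the lambda's unique type, so the
// compiler sees the comparator's body and can inline it.
void sort_std(std::vector<int>& v) {
    std::sort(v.begin(), v.end(), [](int a, int b) { return a < b; });
}
```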
Well-written code that is pervasively const-correct and has few objects at namespace scope with external linkage will typically perform much better in the presence of many virtual functions than C++ code written as if the language were Java or C.