I recall when I first started playing with C++ I was told that you should never use virtual functions unless you absolutely cannot think of a better way to do whatever you are attempting. This is something I have tried to stick to over the years - and indeed is probably why I have never used inheritance or polymorphism much in my own programmes.
However, I notice through a great deal of the code examples offered to questions here, and even over on StackOverflow, that commentators show no hesitation in recommending code that involves virtual functions. What is more, I have even seen several instances here where - what I was taught to call them, though they may well have a different official name - 'pure virtual functions' (those with definitions inside a class of something like virtual int function_name(void)=0) are demonstrated and I was very clearly taught to avoid those like the plague.
I was wondering, therefore: has the official thinking changed since the mid-nineties on when - and even whether - to use virtual functions in your programmes?
'pure virtual functions' (those with definitions inside a class of something like virtual int function_name(void)=0) are demonstrated and I was very clearly taught to avoid those like the plague.
Pure virtual functions are no more expensive than ordinary virtual functions. I do not know why you were taught that.
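For illustration, a minimal sketch (Printer and ConsolePrinter are just made-up names): both functions below are dispatched through exactly the same mechanism, and the = 0 only says that a derived class must supply the body before objects can be created.

#include <iostream>

struct Printer {
    virtual void print() const = 0;                           // pure virtual: no body required here
    virtual void flush() const { std::cout << "flushed\n"; }  // ordinary virtual: has a default body
    virtual ~Printer() {}
};

struct ConsolePrinter : Printer {
    void print() const { std::cout << "hello\n"; }            // supplies the missing body
};

Calling either print() or flush() through a Printer pointer or reference costs exactly the same.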
I was wondering, therefore: has the official thinking changed since the mid-nineties on when - and even whether - to use virtual functions in your programmes?
More competent people, who know about the danger of premature optimisation and that there is no silver bullet?
Larger CPU cache sizes, branch prediction and code prefetch, which made the cost of virtual functions almost negligible except in specific cases?
More advanced compilers, greatly optimising the resulting code?
Constantly evolving language?
I say, if you do not want to use virtual functions, you do not need C++. C is almost everything you need.
Many things are not possible (without excessive additional work) without use of dynamic dispatch and polymorphism.
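As a rough sketch of what I mean (Animal, Dog and Cat are invented names): one loop can handle objects whose concrete type is only known at run time, which is awkward to express without virtual dispatch.

#include <iostream>
#include <memory>
#include <vector>

struct Animal {
    virtual void speak() const = 0;
    virtual ~Animal() {}
};

struct Dog : Animal { void speak() const { std::cout << "woof\n"; } };
struct Cat : Animal { void speak() const { std::cout << "meow\n"; } };

int main()
{
    std::vector<std::unique_ptr<Animal>> zoo;
    zoo.push_back(std::unique_ptr<Animal>(new Dog));
    zoo.push_back(std::unique_ptr<Animal>(new Cat));
    for (const auto& a : zoo)
        a->speak();   // the correct override is chosen at run time
}

In C you would end up hand-rolling the same thing with structs full of function pointers.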
virtual int function_name(void)=0 - bad style. I say, whoever taught you to write like that is trying to use C++ as C, which is completely wrong.
Strange, I would say mid-90s was the time when virtual functions were used more than ever, and were considered to be the final solution to every problem: generic programming was new, and static polymorphism wasn't widely known. It's now that we have alternatives.
I am REALLY enjoying our conversations MiiNiPaa!!! I am glad you were around to answer this one - although I worry I am dominating your time here on the forum!
As I have alluded to, I never received any formal programming education - or for that matter much computer teaching at all, as hardware was extremely scarce in British comprehensive schools in the seventies and eighties. Eventually I did enrol on an undergraduate programming BSc in the mid-nineties at my local university, but family problems forced me to drop out almost before I began - although I obviously kept the course books I had already bought. Combined with about two months of lectures, they are the source of what little formal teaching I have received. Programming is a complete hobby for me, but I believe that just because I am not paid for it does not mean I cannot become good at it!
It is interesting you make the point about 'C' as I was taught that 'C++' should only be considered an extension to 'C' and to VERY sparingly make use of the 'ease-of-use' features it provided. Critically it was said that a 'C++' programme must run well as a 'C' programme first and foremost. Anything the 'C++' language might bring to the pot - such as most obviously classes - would only slow it to a crawl. The terrible performance of 'Visual Basic' was pointed to as an example of when the large scale use of class-based programming becomes ridiculous.
It might make you smile to learn that I was also taught to allocate ALL objects and arrays on the heap with 'New' or 'malloc'. Never, EVER use more than 10-12 double-word stack variables in any function, even if they are only local and will go out of scope at the closing '}'. The point I believe was that it would be too easy to use up all the stack on local variables unless you allocated to the heap.
However, so far as the matter at hand goes I guess you would say - 'don't worry about using virtual functions'! As I recall, the supposed problem was that employing virtual functions requires a 'vTable' to keep track of the various overrides, and that creating, maintaining and repeatedly consulting this table is extremely costly in terms of processor time.
You mention virtual int function_name(argument_list)=0 is bad form. What would be the best way to define a pure virtual function?
Do you know what a vTable is? If we simplify it greatly, you can say that it is an array of function pointers.
Basically, each time you call a virtual function, foo.bar(), you are calling it through one level of indirection:
foo.__vtable[c_bar]()
//  ↑ table pointer    ↑ some compile-time constant
There is one vTable per class and every instance of the class shares it. Adding virtual functions usually increases the size of your class by the size of one pointer (the pointer to the vTable). I would hardly call that costly. Its overhead is comparable to the overhead of using a pointer to a heap-allocated array.
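If it helps, here is a purely illustrative hand-rolled version of that mechanism (the names are mine, and no real compiler literally emits this): the table is shared by the whole class, each object just carries one extra pointer, and the call is a single indirection.

#include <cstdio>

typedef void (*fn)();

void derived_bar() { std::printf("Derived::bar\n"); }

const fn derived_vtable[] = { derived_bar };   // one shared table per class
const int c_bar = 0;                           // compile-time slot of bar()

struct Foo { const fn* vtbl; };                // the hidden per-object pointer

int main()
{
    Foo foo;
    foo.vtbl = derived_vtable;   // set once, when the object is constructed
    foo.vtbl[c_bar]();           // what foo.bar() boils down to
}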
You mention virtual int function_name(argument_list)=0 is bad form. What would be the best way to define a pure virtual function?
I mentioned that the form foo(void) is bad. In C++ you should simply use foo(). It was C where those two forms had different meanings; in C++ they are equivalent.
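So, putting the two points together, a pure virtual function would normally be written something like this (Device and reset are just placeholder names):

struct Device {
    virtual void reset() = 0;   // empty parentheses, and '= 0' marks it pure
    virtual ~Device() {}        // a virtual destructor is good practice in a base class
};

Any class deriving from Device must then provide a body for reset() before it can be instantiated.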
It is interesting you make the point about 'C' as I was taught that 'C++' should only be considered an extension to 'C' and to VERY sparingly make use of the 'ease-of-use' features it provided.
It would make sense in the mid-80s to early 90s, when C++ was introduced, not now that it has become a robust and standardised programming language.
I guess there is a level of dogmatism at work among any establishment. Presumably the chaps who - tried! - to teach me were still working in their minds with mainframes and the early, underpowered x86's.
I am quite interested in the concept of 'premature optimization'. To an extent then I imagine it is best to write your code with the weighting on clarity rather than a perceived improved run-time speed? Modern compilers are supposed to be very good at optimizing the source code.
I'm guessing that just using empty brackets in a C declaration meant the function had one possible 'int' argument you hadn't bothered to specify because C uses an implied 'int' argument type - coupled with the fact you didn't need to actually specify names for your arguments in the declaration? Therefore 'void' made it clear the function had no arguments. This is in contrast to C++ which has no implied return type I think, so you have to explicitly define the type and presence of all variables/arguments?
I'm guessing that just using empty brackets in a C declaration meant the function had one possible 'int' argument
It meant that it could take any number of arguments.
C uses an implied 'int' argument type
Used. Long ago. Now implied int is illegal in C too.
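A small illustration of the difference (the function names are invented); the first two declarations show the old C meaning, the last two the C++ meaning:

/* In old C: */
void f();        /* unspecified arguments - f(1, 2) would still have compiled */
void g(void);    /* explicitly no arguments                                   */

// In C++ the two declarations below mean exactly the same thing - no arguments:
void h();
void h2(void);   // legal, but the (void) spelling is just a leftover C habit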
To an extent then I imagine it is best to write your code with the weighting on clarity rather than a perceived improved run-time speed?
Yes. You should write code for the people who will maintain and extend your code, not for the machine: that is the compiler's job.
"Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows where you live."
Computers are complex. All your predictions about what would take the most processing power are likely to be wrong. You should profile your code, find the bottlenecks and only then optimise.
By the way, another thing that has changed since the 90s is the number of devirtualization techniques implemented in compilers. One of the gcc maintainers recently posted an interesting blog series on that subject:
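To give a flavour of what devirtualization buys you, here is a small sketch of my own (Base, Derived and total are invented names): marking the class final lets the compiler prove which override will run, so it can turn the indirect call into a direct or even inlined one.

#include <iostream>

struct Base {
    virtual int value() const { return 1; }
    virtual ~Base() {}
};

struct Derived final : Base {                // 'final': nothing can override further
    int value() const override { return 2; }
};

int total(const Derived& d)
{
    // Because Derived is final, the compiler knows exactly which value() this is
    // and may call it directly, or inline it, instead of going through the vTable.
    return d.value() + 40;
}

int main()
{
    Derived d;
    std::cout << total(d) << '\n';   // prints 42
}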