Putting function definitions in your header files is generally a bad idea, as they will then be compiled in every cpp file that includes that header, and your linker will be very unhappy about the multiple definitions. Function definitions generally live in cpp files, so that they get compiled only once and the linker sees exactly one copy of them.
#ifndef CLASS_B
#define CLASS_B
class ClassB : public ClassA
{
public:
ClassB(){}; // There is nothing in ClassB for this example
~ClassB(){};
};
#endif
Most definitely - I believe MSVS creates a file for you called "stdafx.h" in which you can include all your header files; then you just add a "#include "stdafx.h"" in whichever file needs them.
An example of what Peter87 was saying. This won't compile, but it should:
test.cpp
#include "B.h"
void testFunc()
{
B myB;
}
If you ever need to use class B somewhere and you don't want everything that's in your common.h file, you won't be able to use it without also including A.h. All of the declarations required to use class B should be included by B.h.
common.h might get rather large by the time you're done, and as long as you only ever use common.h, you won't have any issues. But let's say that you want to show someone specifically how ClassB.h works, and you don't want all the garbage in common.h. You would #include "ClassB.h" and it would fail. As Peter87 said, because ClassB inherits from ClassA, including only ClassB.h fails: the compiler has no idea what ClassA is, since A.h was never included anywhere except in common.h.
This sounds like it defeats the purpose of common.h.
That may be the case. I am not terribly excited about common.h-type files. They have a purpose if multiple .cpp files will be including the same set of header files, but generally I stay away from them.
With small projects it is not a big deal, but unnecessary includes can cause recompilation headaches when projects get big. If you include a file, it is considered a dependency. If that header changes, every .cpp file that includes it, whether it actually uses it or not, will need to be recompiled. If your whole project includes a common.h file, then every time any header in it changes, the whole project must be recompiled. That's not a problem when your project is small, but it can take hours to recompile large projects.
If a set of headers is common to a set of .cpp files, you can put them into a common.h file. But it is generally better just to include the files you need in the .cpp files that need them.
This whole Linker/Compiler connection is something I'll have to learn. I'm taking C this summer at a new school and I think we'll get there.
To stay with the example: let's get rid of common.h, and then give ClassB and main a cout statement. Every file now needs <iostream>. When the linker goes through this, aren't I going to get 3 copies of iostream for the compiler to deal with?
The functions for iostream exist in already compiled binary files that came with your compiler/linker suite. The linker will link accordingly once and once only.
The header only declares the iostream functions, which lets the compiler compile the code in each cpp file correctly and effectively leave a note for the linker in each place saying "here, please call the cout function with these parameters" and the like. The linker will only have a problem if it cannot find a function you need (you didn't link to the compiled binary) or if it finds more than one definition of a function you need.
#include <iostream>
does not cause the actual function code to be put into your cpp files, so you will not be compiling that function code, and there will be only one cout implementation for the linker to find (in the compiled library that came with your compiler/linker suite).