If you follow the convention of putting each class declaration in its own .h file and its implementation in its own .cpp file, you know exactly where to find each.
Does it make any difference in a trivial program? No. Does it make any difference in a large system with hundreds of thousands of lines of code?
Absolutely.
When writing large programs we should divide them up into modules.
These are separate source files: main() goes in one file, say main.c, and the others contain functions.
* the modules will naturally divide into related groups of functions.
* we can compile each module separately and link the compiled modules together (a sketch follows this list).
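As a minimal sketch of this kind of split (the stats module and its mean() function are invented here for illustration), a two-module program might look like this:

```c
/* stats.h -- hypothetical module interface: declarations only */
double mean(const double *xs, int n);

/* stats.c -- implementation of the stats module */
#include "stats.h"

double mean(const double *xs, int n)
{
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += xs[i];
    return n > 0 ? sum / n : 0.0;
}

/* main.c -- the program's entry point, using the module through its header */
#include <stdio.h>
#include "stats.h"

int main(void)
{
    double data[] = { 1.0, 2.5, 4.0 };
    printf("mean = %f\n", mean(data, 3));
    return 0;
}
```

Each file is compiled separately (cc -c main.c, cc -c stats.c) and the resulting object files are linked into the final program (cc -o prog main.o stats.o); a module that has not changed need never be recompiled.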
The main advantages of spreading a program across several files are:
* Teams of programmers can work on the same program, with each programmer working on a different file.
* An object-oriented style can be used: each file defines a particular type of object as a datatype, with the operations on that object as functions. The implementation of the object can be kept private from the rest of the program. This makes for well-structured programs which are easy to maintain (a sketch of this style follows the list).
* Files can contain all the functions from a related group, for example all matrix operations. These can then be accessed like a function library.
* Well-implemented objects or function definitions can be re-used in other programs, reducing development time.
* In very large programs each major function can occupy a file to itself, with any lower-level functions used to implement it kept in the same file. Programmers who call the major function then need not be distracted by the lower-level work.
* When changes are made to a file, only that file need be re-compiled to rebuild the program. The UNIX make facility is very useful for rebuilding multi-file programs in this way (a sample makefile also follows below).
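As a rough sketch of the object-oriented style described above (the Stack type and the stack_* function names are hypothetical), the header can expose an incomplete type while the .c file keeps the representation private:

```c
/* stack.h -- public interface; the representation is not revealed here */
typedef struct stack Stack;   /* opaque type: callers hold only a pointer */

Stack *stack_new(void);
void   stack_push(Stack *s, int value);
int    stack_pop(Stack *s);
void   stack_free(Stack *s);

/* stack.c -- private implementation of the Stack datatype */
#include <stdlib.h>
#include "stack.h"

struct stack {                /* members hidden from every other file */
    int items[100];
    int top;
};

Stack *stack_new(void)
{
    Stack *s = malloc(sizeof *s);
    if (s != NULL)
        s->top = 0;
    return s;
}

void stack_push(Stack *s, int value)
{
    if (s->top < 100)
        s->items[s->top++] = value;
}

int stack_pop(Stack *s)
{
    return s->top > 0 ? s->items[--s->top] : 0;
}

void stack_free(Stack *s)
{
    free(s);
}
```

Because no other file can see the members of struct stack, the representation can later be changed (to a linked list, say) and only stack.c need be recompiled before the program is relinked.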
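And here is a possible makefile for rebuilding the example files sketched above (the target and file names are illustrative): make re-runs a rule only when one of the listed dependencies is newer than its target.

```make
# Hypothetical makefile for the example modules above.
CC     = cc
CFLAGS = -Wall

prog: main.o stats.o stack.o
	$(CC) -o prog main.o stats.o stack.o

main.o: main.c stats.h
	$(CC) $(CFLAGS) -c main.c

stats.o: stats.c stats.h
	$(CC) $(CFLAGS) -c stats.c

stack.o: stack.c stack.h
	$(CC) $(CFLAGS) -c stack.c
```

Editing stats.c alone makes only stats.o out of date, so make recompiles that one file and then relinks prog; the other object files are reused as they are.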