It is much easier to read a program as a beginner when the class prototype and definition are combined into one. In professional programs do you need a separate prototype & definition, or do you see a mixture in the wild? Seems like a lot more work to do both and that it can clog up the code really quickly?
For entities that are intended to be used by other code, you want the declarations - which is what function prototypes are - in a header file, so that they can be easily included by any source files that need to make use of them.
Generally, you don't want to put the definitions of those functions in a header file, because that would create multiple definitions of the same function, which is illegal. (There are tricks to get around this, but unless you have a really good reason to do so, you shouldn't do this.)
For entities that are intended to be private to one source file, it's fine to simply have the functions defined at the top of the file, as long as you don't have circular dependencies which would necessitate a declaration separate from the definition. Ideally, you'd put them in an anonymous namespace, to make absolutely sure no other code can use them:
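(A minimal sketch; the helper name here is just illustrative.)

#include <iostream>

namespace
{
    // Anonymous namespace: internal linkage, so this helper is only
    // visible within this source file.
    int triple(int n) { return 3 * n; }
}

int main()
{
    std::cout << triple(14) << '\n';   // prints 42
}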
From what I've seen it is always virtual return_type function, NEVER return_type virtual function. What happens when you declare your destructor as virtual?
Declare your virtual class methods your preferred way and you will confuse a lot of people.
Separating class methods into declarations and definitions is usually done so a user of the class doesn't know the inner workings of the methods; all they need to know is the interface.
Class declaration(s) are (usually) put into a header file, and the definition(s) go into a source file: .h/.cpp.
Defining a class method as part of the class declaration tells the compiler the method is to be treated as inline.
Whether the compiler actually makes the method inline instead of emitting code for a function call is up to it. Modern C++ compilers do optimization very well.
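A minimal sketch of that split (the Widget class here is just illustrative):

// Widget.h -- the declarations other code #includes
#ifndef WIDGET_H
#define WIDGET_H

class Widget
{
public:
    void setValue(int v);                   // declared here, defined in Widget.cpp
    int value() const { return m_value; }   // defined in-class: implicitly inline
private:
    int m_value = 0;
};

#endif

// Widget.cpp -- the out-of-class definition
#include "Widget.h"

void Widget::setValue(int v) { m_value = v; }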
> Both of these seem to work and I prefer the last one, can either be used in professional programs?
> virtual void DoubleValue() const; // 1
> void virtual DoubleValue() const; // 2
AFAIK, // 1 is the form that is generally favoured. (Perhaps also more logical, since the keyword virtual does not affect the type of the member function)
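For what it's worth, a minimal sketch (class names are illustrative) using form 1, which also shows what a virtual destructor buys you; deleting a derived object through a base pointer runs the derived destructor first:

#include <iostream>

class Base
{
public:
    virtual ~Base() { std::cout << "~Base\n"; }   // form 1: virtual first
    virtual void DoubleValue() const {}
};

class Derived : public Base
{
public:
    ~Derived() override { std::cout << "~Derived\n"; }
    void DoubleValue() const override {}
};

int main()
{
    Base* p = new Derived;
    delete p;   // prints ~Derived then ~Base; without the virtual destructor,
                // deleting through a Base* is undefined behaviour
}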
> In professional programs do you need a separate prototype & definition, or do you see a mixture in the wild?
We see a mixture. Inline functions and templates are often defined within the class.
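For example, a class template typically lives entirely in the header, since the compiler needs the definitions visible at each point of use (a minimal sketch; the name is illustrative):

// Box.h -- a class template; the definitions stay in the header
template <typename T>
class Box
{
public:
    void set(T v) { m_value = v; }   // in-class definitions are implicitly inline
    T get() const { return m_value; }
private:
    T m_value{};
};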
> It is much easier to read a program as a beginner when the class prototype and definition are combined into one. In professional programs do you need a separate prototype & definition, or do you see a mixture in the wild?
If your compiler supports C++20 you can forget about header files. In modules you put all code in one file.
The C++ standard library headers can be imported as modules in C++20.
import <iostream>; vs. #include <iostream>.
Still have to include the C library headers, though. For example, #include <cmath>.
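A minimal sketch of the difference (assuming a compiler and build setup with header-unit support enabled):

import <iostream>;   // C++20: a standard library header imported as a header unit
#include <cmath>     // C library headers still use #include

int main()
{
    std::cout << std::sqrt(2.0) << '\n';
}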
Admittedly I hadn't really done a lot of research/learning about C++20 modules until very recently, after purchasing "Beginning C++20: From Novice to Professional". The examples I had read online before buying the book used import std.core;. All the code examples in the book import the pre-C++20 headers instead.
That one "little change" in consuming C++ standard library modules makes understanding modules a whole lot less confusing.
C++20 is nearly as revolutionary for C++ as C++11 was.
I meant to say that I prefer the first one (not the last), and that I saw the other variant and it bothered me.
virtual void DoubleValue() const; //The one I like
I did not know that when combining prototype/definition the compiler by default treats the method as inline. Very good to know! I saw one small example of headers and I have to get used to them. I had no idea about modules; even more to learn.
From what I've seen, according to that cppreference chart, Visual Studio 2019 & 2022 is currently the only compiler suite that has 100% C++20 support. Though enabling several parts of that support requires using /std:c++latest, not /std:c++20.
The next closest compiler for C++20 support is GCC, with a couple of items listed as "partial." Modules is one that has only partial support.
I've tried to use GCC (command-line via MSYS2) for C++20 module usage and get errors up the wazoo.
That (import std.core;) is non-standard.
Be that as it may, I did state I hadn't spent much time mucking around with modules until recently. The few online references I had found (and read) at the time were MSDN/MS ones, so my experience with the exact usage was less than perfect. That is why I was somewhat surprised and amused to see the actual format was not the typical MS "let's do something non-standard."
In the VS IDE there are a couple of solution/project properties you need to modify to get modules working.
1. Project Settings -> C/C++ -> Language -> C++ Language Standard -> set to Preview - Features from the Latest C++ Working Draft (/std:c++latest)
2. Project Settings -> C/C++ -> Advanced -> Compile As -> set to Compile as C++ Module Code (/interface)
3. Project Settings -> C/C++ -> Language -> Enable Experimental C++ Standard Library Modules -> set to Yes (/experimental:module). This one is not needed now if you've set the advanced setting in 2.
Though setting all 3 together won't hurt.
VS Intellisense gets loopy from time to time, though. Expect it occasionally to say there are errors when there aren't any. Even after a successful build. ::shrug::
VS and Intellisense get royally lost if you use the non-standard module imports like import std.core;. Just. Don't. Use. :D
A module combines the header & source (.h and .cpp) files into one. I must try this at some point as it seems a lot more intuitive and faster.
So much more work to have them as separate files; the compiler should be programmed and smart enough to separate the merged file if need be.
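A minimal sketch of what such a one-file module might look like (the file name, extension, and required build flags vary by compiler; the names are just illustrative):

// math_utils.ixx -- one file replaces the .h/.cpp pair
export module math_utils;

export int doubled(int n)   // exported: visible to code that imports the module
{
    return 2 * n;
}

int helper(int n)           // not exported: private to the module
{
    return n + 1;
}

// consumer.cpp
import math_utils;

int main()
{
    return doubled(21) == 42 ? 0 : 1;
}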
Seems to me C++ is very good and fast, but not very well thought out. Do you think there will be another language that is as fast as C++ and without all the nuances? C# is the closest we can come to it, but it is still a lot slower?
> Seems to me C++ is very good and fast, but not very well thought out.
C++ is not a "do it once and done" language. It was adapted from C and has grown and mutated over the years and decades since.
C# will likely become as convoluted and "not well thought out" as C++ as MS mutates the language.
The major difference is there is pretty much only a single source for C# changes (MS) whereas C++ changes are done by an international committee.
C++11 was a major change to what had been C++ before then. It is considered "Modern C++." C++20 is the second massive change, after C++11.
A goodly portion of the supposed deficits of C++ is backwards compatibility. What was C++98 can still be (for the most part) compiled with a later language standard. There are a couple of things that have been deprecated/removed as newer standards are approved. std::random_shuffle in <algorithm> for instance. https://en.cppreference.com/w/cpp/algorithm/random_shuffle
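The replacement is std::shuffle with an explicitly supplied generator; a minimal sketch:

#include <algorithm>
#include <random>
#include <vector>

int main()
{
    std::vector<int> v { 1, 2, 3, 4, 5 };
    std::mt19937 gen { std::random_device{}() };
    std::shuffle(v.begin(), v.end(), gen);   // std::random_shuffle was removed in C++17
}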
I think quite a few languages - Java, C#, Python - became convoluted over the last decade.
The good thing is that you don't need to know and use everything.
"Some people claim that Rust is as fast as C++, but I haven't seen any examples."
The trouble is getting valid holistic tests. Even vanilla Python can keep pace with C++ on specific tasks. When you throw a battery of different things -- disk, math, strings, network, memory management, whatever -- C++ consistently comes out at worst equal and rarely loses to something off the usual path like Fortran (it's not going to lose to an interpreted language). Also, in a lot of the tests the coder is not a grandmaster at both languages, so one of the programs will be expressed poorly. It is rather hard to set these up to do it "right".
I am not sure C# is that much slower most of the time. It's pretty good.
C++ almost never tells you "you can't do that". Almost every other language does. That alone is worth some of the aggravation. C++ also never says "do it this way or else". C++ is catching up, learning new tricks from its grandkids every few years; this year marks another big modernization upgrade. It will be there when the latest fad language is long forgotten.
I don't know Rust, but if it compiles to native machine code, then it boils down to what assembly is produced. And how heavy is the runtime that goes with the code? (In C++, it's minimal or non-existent. In C# and Java, it's very heavy.) Because compiler optimization is a large part of what makes modern C++ fast, it could be that other projects just don't get the love and support that big projects like gcc/clang get, so they will inevitably lag behind in optimization.