"It's for backwards compatibility"

OK, common ground. You agree that a single thoughtless break in backward compatibility, in one spot of the language, could cause thousands of programmers additional work fixing their own code?
Well, that depends. When a language changes, implementations of its previous versions don't instantly cease to exist. For the sake of argument, imagine that C++23 was completely source-incompatible with previous versions. If you wanted to keep your existing code unchanged and use C++23 in new code in the same projects (how good an idea this would be is not the point), you could use an older language version for legacy code and the latest version for new code, then link everything together. There's no reason to think this could not possibly work, since it's routinely done between C and C++.
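For a very rough sketch of what I mean (hypothetical file names; GCC's existing -std switches stand in for the imagined version split):

// legacy.cpp -- old code, compiled as C++98, exposed through a C-compatible boundary
extern "C" int legacy_sum(int a, int b) { return a + b; }

// modern.cpp -- new code, compiled with the latest standard
#include <iostream>
extern "C" int legacy_sum(int a, int b);  // declared here, defined in legacy.cpp

int main() { std::cout << legacy_sum(2, 3) << '\n'; }

$ g++ -std=c++98 -c legacy.cpp
$ g++ -std=c++23 -c modern.cpp
$ g++ legacy.o modern.o -o demo   # one binary, two language versions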
> you could use an older language version for legacy code and the latest version for new code,
> then link everything together.

Even assuming that we do have a backward-compatible linker, this kind of interoperability may be limited to standard-layout types, i.e., to types that can be used for communicating with code written in C.
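A quick way to check whether a given type qualifies (a sketch; Widget and Fancy are made up):

#include <type_traits>

struct Widget {          // plain data members, no virtuals: standard layout
    int id;
    double value;
};

struct Fancy {
    virtual ~Fancy() {}  // a virtual function disqualifies the type
    int id;
};

static_assert(std::is_standard_layout<Widget>::value, "OK across the boundary");
static_assert(!std::is_standard_layout<Fancy>::value, "not OK");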
then link everything together.


I'm still laughing at that... Best joke of the day.

Since C++ was standardised, there have been several breaking changes to the ABI, and hopefully more to come to fix existing issues with the C++ standard. The best that could probably be hoped for is that a compiler can be set to compile against a specific standard for a compilation unit using that compiler's current ABI, so that everything links for that compiler. I only know VS2019, but the earliest standard it will compile is C++14, although you can specify C++14/17/20/latest for each compilation unit. So it sort of possibly does this. What you can't do, though, is mix old with breaking new. If you want to, say, add a breaking C++23 change to a C++14 code unit, then the whole of that unit will have to be C++23-compatible.
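From the command line, that per-unit selection looks roughly like this (a sketch; /std is VS's real switch, the file names are made up):

rem each unit picks its own language version, but shares the compiler's one ABI
cl /std:c++14 /c legacy_unit.cpp
cl /std:c++20 /c new_unit.cpp
link legacy_unit.obj new_unit.obj /OUT:app.exe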
this kind of interoperability may be limited to standard-layout types, i.e., to types that can be used for communicating with code written in C.
"May be", sure. "Will be"? Not necessarily.

seeplus: I'm not sure what you're getting at. Yeah, in this hypothetical scenario you'd have to rebuild your project. Until there's any kind of ABI standardization (either from the ISO committee or from other groups) you won't be able to mix-and-match object files from different compiler versions. But this has nothing to do with the language versions you're compiling, and everything to do with your compiler vendor not guaranteeing a stable C++ ABI. You could compile two C++14 translation units with two different versions of GCC and run into the same problem. They'll link (maybe), but they won't interoperate.

As I was saying, yeah, you'd have to rebuild your project, but you wouldn't need to make any code changes, which was my point.
What I was referring to is that if you compile one unit as, say, C++98 and another as, say, C++23, then, depending upon how things are done by the compiler, will the C++98 code be compiled using the C++98 ABI in use at that time, or the latest ABI used for C++23? If it's compiled using the old C++98 ABI, then the two units might not link OK.

This is all fraught with issues. You'd need the version of the standard library for each version as well.

I guess this is why the C++ Standards Committee try to avoid breaking changes to the language spec.
if you compile one unit as, say, C++98 and another as, say, C++23, then, depending upon how things are done by the compiler, will the C++98 code be compiled using the C++98 ABI in use at that time, or the latest ABI used for C++23?
Your error is in thinking the ABI is tied to the language version, rather than to the compiler version and platform. Compiling C++98 code with a compiler in the current state of standardization would not use the ABI the compiler used to use at the time C++98 was current. The compiler would use whatever ABI it's using currently to support both C++98 and C++23.
Why would it be any other way? What's more important? The ability to interoperate of two pieces of code being compiled right now, or that of two pieces of code being compiled decades apart?
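GCC's libstdc++ is a concrete example of this: its dual ABI for std::string is selected by a macro and the library's configuration, not by the -std flag. Roughly:

$ g++ -std=c++98 -c old_code.cpp
# a modern GCC still uses its current std::string layout here, despite -std=c++98
$ g++ -std=c++98 -D_GLIBCXX_USE_CXX11_ABI=0 -c old_code.cpp
# opting into the pre-C++11 layout is a separate, explicit choice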

If we're talking about a hypothetical scenario where an ABI has been standardized at some level, then I have no answer for you. It depends on how the ABI is specified and precisely how it changes from one version to the next. As I was saying yesterday, a breaking grammar change doesn't necessitate a breaking ABI change, nor vice versa.

You'd need the version of the standard library for each version as well.
This is already true. A compiler that allows specifying the language version through a switch must hide or show different library features depending on configuration. If there were a significant breaking grammar change, it would just mean the library would need to be reimplemented.
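You can see that gating in every implementation today; a simplified sketch of the mechanism (the comments name real examples, the structure is illustrative):

// inside a hypothetical standard library header
#if __cplusplus >= 201703L
// expose C++17-and-later features here, e.g. std::optional
#endif

#if __cplusplus < 201703L
// keep pre-C++17 material visible only in older modes, e.g. std::auto_ptr
#endif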
@helios. Thanks.
Ganado wrote:
- auto_ptr was still around during C++11, but was marked as deprecated and removed in C++17.
- Trigraphs have been in C++ since forever, but were removed in C++17.
- throw() has been deprecated since I think C++11, but was finally removed in C++20.

What??? But...I've been using auto_ptr and throw() for years with no issues! And I recently wrote a program that used some custom exception handling and it compiled as C++20 with only a couple unrelated warnings.

$ c++ -std=c++2a -Wall -Wextra -pedantic str_exceptions.cc -o str_exceptions
In file included from str_exceptions.cc:17:
./str_exceptions.h:57:18: warning: comparison of integers of different signs: 'int' and
      'std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >::size_type'
      (aka 'unsigned long') [-Wsign-compare]
                if (k < 0 || k >= str.length ())
                             ~ ^  ~~~~~~~~~~~~~
str_exceptions.cc:24:15: warning: unused parameter 'argc' [-Wunused-parameter]
int main (int argc, char const *argv[])
              ^
str_exceptions.cc:24:33: warning: unused parameter 'argv' [-Wunused-parameter]
int main (int argc, char const *argv[])
                                ^
3 warnings generated.
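(As an aside, that first warning is just a signed/unsigned mismatch; a sketch of a fix for that line in str_exceptions.h:)

if (k < 0 || static_cast<std::string::size_type>(k) >= str.length ())
// the cast is safe: the k < 0 test has already ruled out negative values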

How does that work? I know that "deprecated" means that it just isn't used anymore, but "removed" means that it isn't even supported anymore, correct?

[Edit] Sorry, I got onto the thread kind of late, but I really am curious.
> The compiler would use whatever ABI it's using currently to support both C++98 and C++23.

This may not be possible without some degree of backward compatibility. For instance, for some non-standard-layout class, the same class definition may have completely different semantics, resulting in completely different object layouts.

Restricting the interoperability to communication with standard-layout types would still be possible, unless (admittedly, this is a far-fetched example) the new-fangled non-backward-compatible specification makes creating standard-layout types impossible; for instance, if it insists on a Python-style object model.

In reality, a fair degree of backward compatibility between versions of C++ would always be present.
auto_ptr was deprecated in C++11 and removed in C++17. If code using auto_ptr compiles when built as C++20, then it's not being compiled as standard C++20.

https://en.cppreference.com/w/cpp/memory/auto_ptr
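The migration itself is mechanical in most cases; a sketch:

#include <memory>

// pre-C++17 (deprecated since C++11, removed in C++17):
// std::auto_ptr<int> p(new int(42));

// the modern replacement:
std::unique_ptr<int> q = std::make_unique<int>(42);  // make_unique itself is C++14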

Re throw(): throw() had/has several uses. The one(s) that were deprecated in C++11 and removed in C++17/20 relate to dynamic exception specifications for a function. This used to be allowed:

void f() throw(int);

meaning that function f() might throw an exception of type int, and

void f() throw();

meaning it doesn't throw; same as noexcept.

https://en.cppreference.com/w/cpp/language/except_spec
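Side by side, in modern code you'd write:

// removed forms (shown for comparison only):
//   void f() throw(int);   // gone in C++17, no direct replacement
//   void g() throw();      // gone in C++20
// their modern counterparts:
void f();            // any function may throw unless it says otherwise
void g() noexcept;   // the replacement for throw()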
Well, I usually just use C++11 because that's what I learned, but I've been compiling it as C++20 (-std=c++2a) and it seems to work fine.
The GCC website says you should use -std=c++20, and -std=c++2a is used in GCC 9 and earlier. If the compiler isn't feature-complete for C++20 yet, my guess is that the developers are not quite at the point of breaking compatibility.

Or is your c++ an alias for clang++? I'm less familiar with clang. But their site basically says the same thing: "You can use Clang in C++20 mode with the -std=c++20 option (use -std=c++2a in Clang 9 and earlier)."
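If you're unsure which standard a given mode actually selects, both compilers can report the __cplusplus value they define (a quick check, not specific to any project):

$ echo '' | c++ -std=c++2a -x c++ -E -dM - | grep __cplusplus
# prints e.g. '#define __cplusplus 202002L' for a finished C++20 mode;
# pre-release -std=c++2a modes report an intermediate value instead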

Microsoft's history with throw(...) is even weirder.
See The sad history of the C++ throw(…) exception specifier: https://devblogs.microsoft.com/oldnewthing/20180928-00/?p=99855

Compilers may allow throw() as an extension even once they complete C++20 implementation.
Unless they break macros, overloading, and other routes, you can fix some of it too. E.g., if you have auto_ptr a billion times in your code base, a #define or a little class or whatever you need to make it now be a unique_ptr or shared_ptr or whatever would let you keep using the code. It's ugly, but I have seen plenty of code bases with little hacks that define something (standard or not!) that the code needed but is now MIA.

auto_ptr is a bad example there; pretty sure most compilers will allow it, throw just a warning, and still support it. But there are things that are gone, or were for a specific OS or whatever, e.g. getch(), which you can recreate with an inline assembly command or something... I know we at one time had a gotoxy rebuilt from ncurses or something along those lines.
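Something like this, say (a hypothetical shim; note it is not a drop-in, since auto_ptr's weird copy semantics have no equivalent in unique_ptr):

// compat.h -- hypothetical band-aid for a code base full of dead names
#include <memory>

namespace compat {
    // copying one of these now fails to compile instead of silently
    // transferring ownership the way auto_ptr did -- arguably a feature
    template <typename T>
    using auto_ptr = std::unique_ptr<T>;
}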
Yeah, it's clang++. I think it's version 10, though?
$ c++ -v
Apple LLVM version 10.0.0 (clang-1000.10.44.4)
...

Ugh. Yet another reason I don't like Microsoft. That's not to say Apple doesn't have its problems, but I've found that for programming, even my old Power Mac G5 is better than the crummy PCs they make us use at work. I even offered to bring in my own computer once, but the manager started griping about some compatibility BS, with Word and Visual Studio and Excel not working with "Apple's sh*t." I tried telling him that those applications all suck anyway (IMO), but he wouldn't buy that. Oh well.

@jonnin,
I haven't used #define a whole lot, except for disabling assertions after debugging (#define NDEBUG; see the snippet below). But that's a good point; I have some code from a friend of mine at work that has a ton of #define stuff, and it allows him to use a bunch of things that were deprecated/removed. He also wrote his own custom headers because he didn't like the standard ones.
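For reference, the NDEBUG trick only works if the macro is seen before the header:

#define NDEBUG      // must precede <cassert>; turns assert() into a no-op
#include <cassert>

int main () { assert (false); }  // expands to nothing, so this runs fine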
a friend of mine at work that has a ton of #define stuff [...] He also wrote his own custom headers because he didn't like the standard ones.
I think we all know a guy like that. I used to work with a guy who used Hungarian notation on everything, used __in__ and __out__ annotations on parameters, never used references, and wrote gigantic functions. Perpetually stuck in the early-to-mid '90s, in other words.
Hungarian notation?

It sounds kind of like Reverse Polish Notation, but for computer programming. I mean, the sort of thing that nobody uses anymore except weird old guys like me who still use an HP 32s II from the 1970s (which still works, by the way).

I do occasionally use __FILE__, __DATE__, and __TIME__, just for recording compilation time and file location. But maybe that's not what you mean?
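Something like this, I mean:

#include <iostream>

int main ()
{
    // standard predefined macros that expand to string literals
    std::cout << "Compiled from " << __FILE__
              << " on " << __DATE__ << " at " << __TIME__ << '\n';
}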
It sounds kind of like Reverse Polish Notation, but for computer programming.
Nope. It's completely unrelated. It's the practice of adding "warts" to identifiers that are indicative of the identifier's type. It comes from the days of lame IDEs that couldn't show you the type by just hovering over a name.
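For example (made-up identifiers, Win32-flavoured warts):

DWORD dwBytesRead;    // dw   -> DWORD
LPSTR lpszFileName;   // lpsz -> long pointer to a zero-terminated string
BOOL  bIsDirty;       // b    -> BOOL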

I do occasionally use __FILE__, __DATE__, and __TIME__, just for recording compilation time and file location.
Again, no. __in__ and __out__ are macros that expand to nothing, used to document whether a function parameter is input or output. They're redundant in idiomatic C++ because you can do the same with const.
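In other words, something like this (reconstructed; the macro names come from the post, the function is made up):

#define __in__      // expands to nothing; purely documentation
#define __out__

void copy_name(__in__ const char* src, __out__ char* dst);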
I meant it was kind of similar in that only the odd person used them. But I guess that could come in handy...actually, since I don't use an IDE, that could come in really handy!
@agent max
... even my old Power Mac G5 is better than the crummy PC's they make us use at work. I even offered to bring in my own computer once, but the manager started griping about some compatibility BS, with Word and Visual Studio and Excel not working with "Apple's sh*t." I tried telling him that those applications all suck anyway (IMO)
With you all the way on that ...

@helios
I used to work with a guy who used Hungarian notation on everything, used __in__ and __out__ annotations on parameters, never used references, and wrote gigantic functions.
I feel your pain. I worked with one guy who named all his parameters a, b, c, ..., and reused them within the function, all of which were huge incomprehensible blobs. We ended up using code samples from him in interviews and asking, "What's wrong with this?". This was in the 90's.