Then you'd just repeat the Perl 5/Raku split. You'd have a "new" C++ that isn't actually C++ but it's called C++ with a different version number and people will then bicker for a few years until they realize that the "new" C++ isn't actually C++ and that the "old" C++ just vehemently refuses to die, so they name it something else.
The C++ standards committee can standardize a backwards-incompatible "new" C++ for all I care as long as they don't call it C++.
> The C++ standards committee can standardize a backwards-incompatible "new" C++ for all I care as long as they don't call it C++.
This x2. It's only C++ if you can optionally refactor random lines of code in a project written in compliance with C++98 using new features and still get a valid program.
Once you mess with the language in a way that your C++98 code is either not supported or requires rewrites or redesigns to comply with the new standard, then quite obviously it is not C++ anymore.

Semver matters.
So there is a path, but it has to take an upgrade path into account for big legacy projects.

In practice, moving to a new compiler can be a big project for companies with larger code bases. Sensible deprecation of features could be (and in practice is) part of that.
Even though the standard is quite careful, in practice big C++ projects sometimes rely on non-conforming behavior of the specific compiler version they use. An example is MSVC's template support, which allowed constructs that the standard didn't.
I think this idea of "no broken compatibility"/"no rewrites" never properly considers COST * TIME.

In the short term, breaking is bad. The more time passes, the better it gets.

The cost of STUPID C/C++/JS behavior is measured in the BILLIONS.

Are the makers of these languages aware of it? Yes. Do they know how it could be fixed? Sure. Then WHY is it not fixed?

Because time (and how many MILLIONS OF PEOPLE are affected) is not considered in the calculation. But today, DECADES later, the cost of these mistakes is known with near-100% certainty, and it is proven beyond doubt that the languages that fixed them ARE better.

In the face of reality and facts, why resist so much?
---
The thing is: HOW do we get out of this mess?

The big irony is that the JS world has shown how (though without committing to it with the proper force): build transpilers. Make a "Better C++" that transpiles to "BAD C++". Make clear that everyone must move forward, but provide this as a partial step.

Make auto-converters. Fix damn stupidities like dangling ifs (seriously, some of this stuff is a no-brainer). Have a clear vision of how to move forward.

And drop the ego. C/C++ need not turn into Rust, but why believe they could not truly get better?

Now, in the case of C/C++ there is a big problem: they are the "de facto" ABIs for the rest of the world. Apple has the same issue with Swift -> Obj-C around nullability (which APIs never return null even if in theory they could?).
They added annotations to the APIs. This could allow the transformations to be mechanized and ABI translations to be provided that could be injected into the compiler.

For example:

ABI STEP 1 (annotate only): old code is assumed to be evil.

ABI STEP 2 (rewrite): the compiler could then auto-translate the calls on demand at compile time, following map-like transformations. Eventually, when all the calls are converted, this step is erased and the runtime penalty removed; then everything becomes good.
Or you would have the Ethernet situation: a new better tech that is called Ethernet while having mostly nothing to do with the old one, and people migrate to it, and the old tech is eventually not used anymore.
> Then you'd just repeat the Perl 5/Raku split. You'd have a "new" C++ that isn't actually C++ but it's called C++ with a different version number and people will then bicker for a few years until they realize that the "new" C++ isn't actually C++ and that the "old" C++ just vehemently refuses to die, so they name it something else.
Well, just FYI, the "new" C++ is not the "old" C++ anymore. Languages "evolve". Try to compile an old program on a new compiler. The same is true for other languages (C, Fortran, Perl).
> The C++ standards committee can standardize a backwards-incompatible "new" C++ for all I care as long as they don't call it C++.
They already do this. For C, for example, you have C89, C99, etc.
It'll most likely be abandoned in droves. One of the main reasons to stick with C++ (and C) is that the investment you put in the code you're writing right now, no matter how fugly it is, will be there and keep working in the future.
Also:
> I think it also makes sense for C++ developers whose projects are relatively short-lived, e.g. some games.
There is a difference between games and engines - the games might be short lived, but most C++ engines are very long lived, going back decades (e.g. see how the recent Half-Life: Alyx can trace its codebase back to Quake 1, or how pretty much anything based on the Unreal Engine goes back to the mid-90s - there are screenshots of UE1 running on what appears to be either Windows 3.1 or NT 3.51). Even when "engines" are being made "from scratch" this is often a marketing gimmick and the "from scratch" was just renaming the engine because it got a new rendering model and people can't tell the difference. EVEN when this isn't the case, often new engines are based on existing code and/or libraries.
> but most C++ engines are very long lived, going back decades
While true, the engines are also staffed and can absolutely keep up with changes. They are regularly tasked with new toolchains, platforms, OSes, and APIs to leverage already. This would just be more of the same, and if you look at the goals of the paper: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2020/p213...
That lines up pretty damn well to what game engines want as well.
> One of the main reasons to stick with C++ (and C) is that the investment you put in the code you're writing right now, no matter how fugly it is, will be there and keep working in the future.
I think you should read the actual paper instead. http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2020/p213...

The things they talk about actually abandoning are things like byte sizes other than 8 bits, or source code in file systems that don't support file extensions or nested directories.
Chances are any existing C++ codebase is already incompatible with the systems they want to more formally deprecate/abandon in the first place.
Agreed, I think looking at the Python 2 and 3 migration catastrophe gives a glimpse into what this could look like, but I imagine it would be much worse given the types of (large) projects that are backed by substantially "dated" C/C++ code.
> One of the main reasons to stick with C++ (and C) is that the investment you put in the code you're writing right now, no matter how fugly it is, will be there and keep working in the future.
Except slowly people will be writing parts of it in Rust, or they would be using libraries written in Rust (because availability), and then after a while people get annoyed of using more than one language and the whole project will be rewritten in Rust. Of course, at that point, a new language will be invented and Rust code will be looked at as rusty :)

Source?
If C++ abandoned backward compatibility, then this C+++ would compete with modern languages like D and Rust, and it's absolutely not clear what the winner would be. As long as C++ is backwards compatible, it has the tremendous advantage of billions of LoC which don't have to be rewritten.
Some here are saying "Why not just use Rust/D/Zig/Nim/Crystal if you're going to break backwards compatibility?" I believe the proposal is closer to "We've removed this old feature, run this tool to replace it with the new one." C++ will keep looking like C++, not like Rust. Here's Google's "Codebase Cultivator" talk explaining the idea for their Abseil library:

https://youtu.be/tISy7EJQPzI?t=2209

I remember watching another talk where they propose something similar for the language itself, but I can't find it at the moment.
Per Hyrum's law, someone will rely on any given behavior. So, if you choose backwards compatibility, it becomes harder for the committee to evolve the language over time.
The path of breaking changes can be made less painful, and does not necessarily invalidate all the code that is already written.
With that being said, the problem right now is not the decision of keeping backwards compatibility or not, but the fact that the standard should be explicit about it, so people know what to expect.
Yeah there's a lot of discussion here as if the proposal is some radical breaking thing when in reality it's stupid stuff like "stop pretending a byte isn't always 8-bits" or "if you're a platform that's still shipping a 10 year old compiler, you're not supported anymore in the latest language version." Problems that you'd never even dream of in most other languages.
Because C++ is not a single-implementation language; rather, it is driven by a standard with multiple implementations.
Such a solution is only possible if driven by the standard, otherwise it will never be fully available, just like the static analysis tooling varies by vendor.
Unless you manage to integrate this directly into the compiler stack and have it work 100% of the time with no "ifs" or "buts" I don't think it'll work.
Maybe you could do like Rust with their "Epoch" system that lets you interoperate code using various standards in order to migrate progressively without breaking backward compatibility. I suspect that it would be a lot harder to make it work for C++ however, mainly due to its extreme reliance on #includes (especially for anything using templates) and more common use of macros.
I'm not saying it's impossible but I suspect that it would fragment the ecosystem quite a bit. Removing "old features" tends to have massive side effects in a language like C++ with metaprograming, overloading, multiple inheritance, unlimited macro usage and complex symbol resolution rules.
So I think "why not just use Rust/D/Zig/Nim/Crystal" is warranted feedback for these proposals (and you could probably add Go, C#, Java and a few others).
Sometimes I wish Google would just stop participating in open source and standardization. The proposal actually suggests supporting Fuchsia at the language level, but not any of the BSDs, and replacing the ABI with protobuf (or similar).
Much of this proposal would create insurmountable engineering problems for everything except the Google monorepo, or similar infrastructure at $1T companies building monoliths.
Historically, most innovation has happened outside of those environments.
Sabotaging the toolchains of smaller shops is not going to pan out well for the industry at large.
They list an "initial non-exhaustive list of platforms" which includes "OS/kernel." I don't think it's fair to say that they are excluding the BSDs even if they aren't explicitly mentioned. The point is that new C++ shouldn't need to support old and historical platforms.
Many people don't realize just how conservative C++ is (the OP only briefly mentions this). The committee not only avoids breaking existing source code, but even binaries, although binary compatibility is not defined in the standard at all.
C++ 11, 14, 17, 20, and 23 are all binary compatible (modulo some possible oversights). There are many quality-of-life improvements that got voted out because then a binary compiled in C++11 wouldn't be compatible with one built in C++23 anymore.
As a simple example ([1] has many more): the only reason we have both std::scoped_lock and std::lock_guard is because changing the definition of the latter would have required re-compilation of existing code, so a new class had to be introduced.
ABI stability even overrides performance in committee decisions (see the std::regex example, but I also heard it affects the footprint of unique_ptr - edit: found it [2]).
tl;dr: breaking source compatibility seems way off if implementors don't even dare to break binary compatibility.

[1] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2020/p202...
There would then be two C++ languages. The overwhelming majority of users would stick with the old one, a scant few would take up the new one, and the rest would move on to some other language.
I'm currently working on a C++ project that has been taken out of cold storage, and when the old rewrite question pops up, the tech stacks considered for the job do not feature C++ at all. The main reason is that nowadays there are better tools and frameworks to implement the non-performance-critical parts of an application, such as the GUI, which incidentally takes up more than half the total lines of code; the rest can be implemented easily with small support services developed in whichever tech stack you choose. Thus C++'s "jack of all trades" trait is no longer a major factor in the decision. Meanwhile, the lack of a stable ABI starts to feature prominently as a reason not to adopt C++.

Seems like a terrible thing to break this late.
For this to work you would need a stable bridge for calling between the old and new versions, and with C++ lacking a stable ABI and depending heavily on header-template interfaces (which are part of the "bad C++" one would want deprecated), this could be an issue. Sure, extern "C" already exists, but it's a leaky abstraction and it won't be added to existing libraries overnight. Without this you can look at the Python 2 to 3 migration to see what will happen.
I’ll never understand the portion of the C++ community that insists on modernizing the language to the detriment of everything else. When is enough enough? Just let the language be and focus on improving the standard library, not changing language features.
You are missing the point. Almost all of this has to do with binary compatibility of the standard library, not the core language. Things like unordered_map/map cannot be improved because the binaries would no longer be compatible. lock_guard could not go from one mutex to one or more because it would have a different name in the symbol table for some compilers, so we got a new scoped_lock that adds another name to do the same thing. push_back on vector cannot return a reference as emplace_back does, for similar reasons. All of those would leave the source code compatible, but break linking, often silently.
These are some of the ones off the top of my head, there are more. So the fix in these cases is another name, and that has the problem of making the language needlessly more complicated and harder to learn.
So, the author wants something between FORTRAN's Ratfor and F. I can see it as a supplement to the C++ standard but not being the actual standard. It would make people distrust C++ greatly if implemented as the standard.