withzombies|3 months ago
It's the type of dog fooding they should be doing! It's one reason why people care so much about self-hosted compilers, it's a demonstration of maturity of the language/compiler.
cogman10|3 months ago
So you can never be perfectly bleeding edge, because that would keep you from building your compiler with an older compiler that doesn't support those bleeding-edge features.
Imagine, for example, that you are Debian and you want to prep the next stable release. It's reasonable to bootstrap the next release with the prior release's toolchain. That gives you a stable starting point.
kstrauser|3 months ago
A compiler is perfectly capable of compiling programs which use features that its own source does not.
unclad5968|3 months ago
https://en.cppreference.com/w/cpp/compiler_support/20.html
withzombies|3 months ago
A good example is the C++11 standard's garbage collection! It was explicitly optional, but AFAIK no one implemented it.
https://isocpp.org/wiki/faq/cpp11-library#gc-abi
andsoitis|3 months ago
GCC's C++ standards support page explains why C++23 and C++26 are not the default: https://gcc.gnu.org/projects/cxx-status.html
ajross|3 months ago
Backwards compatibility. Not all legal old syntax is necessarily legal new syntax[1], so there is the possibility that perfectly valid C++11 code exists in the wild that won't build with a new gcc.
[1] The big one is obviously new keywords[2]. In older C++, it's legal to have a variable named "requires" or "consteval", and now it's not. Obviously these aren't huge problems, but compatibility is important for legacy code, and there is a lot of legacy C++.
[2] Something where C++ and C standards writers have diverged in philosophy. C++ makes breaking changes all the time, where C really doesn't (new keywords are added in an underscored namespace and you have to use new headers to expose them with the official syntax). You can build a 1978 K&R program with "cc" at the command line of a freshly installed Debian Unstable in 2025 and it works[3], which is pretty amazing.
[3] Well, as long as it worked on a VAX. PDP-11 code is obviously likely to break due to word size issues.
menaerus|3 months ago
Please don't spread misinformation. Breaking changes are almost nonexistent in C++. The last one was the COW std::string and std::list, ~15 years ago, with the major switch from C++03 to C++11. And even then GCC wouldn't let your code break, because it supported dual ABIs: you could mix C++03 and C++11 code and link them together.
So C++ actually tries really hard _not_ to break your code; that is the philosophy behind a language adhering to something called backwards compatibility, you know? It's something many, such as Google, opposed and left the committee/language over. I thank the C++ language for that.
Introducing new features or new keywords, or tightening the implementation of existing ones (such as narrowing integral conversions), is not a breaking change.
dagmx|3 months ago
The issue with defaults is that people have projects that implicitly expect the default to be static.
So when the default changes, many projects break. This is maybe fine if it’s your own project but when it’s a few dependencies deep, it becomes more of an issue to fix.
1718627440|3 months ago
They are discussing in this email thread whether it is already properly supported.
> It's one reason why people care so much about self-hosted compilers
For self-hosting and bootstrapping you want the compiler to be compilable with as old a version as possible.
BeetleB|3 months ago
"Properly supported" is the key here. Does GCC currently properly support C++23, for example? When I checked a few months ago, it didn't.
withzombies|3 months ago
Warnings becoming errors would be scoped to gcc itself only, and they can fix them as part of the upgrade.
hulitu|3 months ago
> cursing because the old program does not compile anymore
No.
superkuh|3 months ago
I love that C++ leaves a long enough time between changing targets to actually be useful, and that its culture is about stability and usefulness for users trying to compile things, rather than dev-side improvements über alles.
mustache_kimono|3 months ago
This is nonsense. Apt devs can target a rustc release and that release can be the same release that ships with Debian? Moreover, since those apt devs may have some say in the matter, they can choose to update the compiler in Debian!
> The entire language culture is built around this rapid improvement.
... Because this is a cultural argument about how some people really enjoy having their codebase be 6 years behind the latest language standard, not about any actual practical problem.
And I can understand how someone may not be eager to learn C++20's concepts or to add them immediately to a code base, but upgrades to your minimum Rust version don't really feel like that. It's much more like: "Wow, that's a nifty feature I immediately understand and would like to use from the std lib. That's a great alternative to [much more complex thing...]" See, for example, OnceLock, added at 1.70.0: https://doc.rust-lang.org/std/sync/struct.OnceLock.html