I recently started a pet project using modules in MSVC, the compiler that at present has the best support for modules, and promptly ran into a compiler bug: it didn't know how to compile my code and asked me to "change the code around this line".
So no, modules aren't even here, let alone to stay.
Never mind using modules in an actual project when I could repro a bug so easily. The people preaching modules must not be using them seriously, or otherwise I simply do not understand what weed they are smoking. I would very much like to stand corrected, however.
I still hope that modules become mature and safe for production code. Initially I coded in C/C++, and the header #include/#ifndef approach seemed fine at the time. But after using other programming languages, it started to feel boilerplate-heavy and archaic. No sane programming language should require duplication in order to export something (for example, writing both the full function and its prototype); you should write something once and export it easily.
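To make the duplication concrete, here is a minimal sketch (file names and the `area` function are made up), with the module equivalent shown in comments:

```cpp
// draw.h — the classic approach: an include guard plus a prototype
// that duplicates the function's signature.
#ifndef DRAW_H
#define DRAW_H
int area(int w, int h);
#endif

// draw.cpp — the definition repeats the same signature again.
int area(int w, int h) { return w * h; }

// With C++20 modules, the signature is written exactly once:
//   export module draw;
//   export int area(int w, int h) { return w * h; }
```

Any change to the signature has to be made in two places with headers, and in one place with modules.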
Modules are still in the early adopter phase, despite three years. There are unfortunately bugs, and we still need people to write the "best practices for C++ modules" books. Everyone who has used them says overall they are a good thing and worth learning, but there is a lot about using them well that we haven't figured out.
I'm afraid things will continue to suck for a long time, and will still be second-rate even once modules become broadly supported, since sepples programmers, being Real Programmers™, are not entitled to have nice things.
I don't think you're missing something. The standards committee made a bad call with "no submodules", ran into insurmountable problems, and doubled down on the bad call via partitions.
"Just one more level bro, I swear. One more".
I fully expect to sooner or later see a retcon on why really, two is the right number.
Yeah, I'm salty about this. "Submodules encourage dependency messes" is just trying to fix substandard engineering across many teams via enforcement of somewhat arbitrary rules. That has never worked in the history of programming. "The determined Real Programmer can write FORTRAN programs in any language" is still true.
From the outside looking in, this all feels like too little too late. Big tech has decided on Rust for future infrastructure projects. C++ will get QoL improvements… one day, and the committee seems unable either to keep everyone happy or to disappoint any single stakeholder. C++ will be around forever, but will it be primarily legacy?
Yes. Unfortunately the committee has completely abandoned safety at this point. Even memory/thread safety profiles have been indefinitely postponed, and the latest ghost "safety lifetimes" thing is completely unimplementable.
There literally isn't a plan or direction in place to compete with Rust on safety currently. They've got maybe until C++29 to standardise lifetimes, and then C++ will transition to a legacy language.
> Big tech has decided on Rust for future infrastructure projects. C++ will get QoL improvements…
when people say this do they have like any perspective? there are probably more cpp projects started in one week (in big tech) than rust projects in a whole year. case in point: at my FAANG we have probably like O(10) rust projects and hundreds of cpp projects.
import std; is an order of magnitude faster than #including the standard library headers individually, if that's evidence enough for you. It's faster than #include <iostream> alone.
Chuanqi says "The data I have obtained from practice ranges from 25% to 45%, excluding the build time of third-party libraries, including the standard library."[1]
If your tools are not updated, that isn't the fault of C++. You will feel the same about Rust when forced to use a 15-year-old version too (as I write this, Rust 1.0 is only 10 years old). Don't whine to me about these problems; whine to your vendors until they give you the new stuff.
I get by without modules or header files in my C++ projects by using the following guidelines:
- Single translation unit (main.cpp)
- Include all other cpp files in main
- Include files in dependency order (no forward declarations)
- No circular dependencies between files
- Each file has its own namespace (e.g. namespace draw in draw.cpp)
This works well for small to medium sized projects (on the order of 10k lines). I suspect it will scale to 100k-1M line projects as long as there is minimal use of features that kill compile times (e.g. templates).
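A minimal sketch of that layout (the file names and helpers are made up), with the "files" inlined here so the snippet is self-contained:

```cpp
// math_utils.cpp — no dependencies, so it is included first.
namespace math_utils {
inline int clamp(int v, int lo, int hi) {
    return v < lo ? lo : (v > hi ? hi : v);
}
}

// draw.cpp — depends on math_utils, so it is included after it;
// no forward declarations are needed.
namespace draw {
inline int brightness(int v) { return math_utils::clamp(v, 0, 255); }
}

// main.cpp would then simply be:
//   #include "math_utils.cpp"
//   #include "draw.cpp"
//   int main() { /* ... */ }
```

Because everything lands in one translation unit, the compiler sees every definition exactly once and link times essentially disappear.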
Macro hygiene, static initialization ordering, control over symbol export (no more detail namespaces), slightly higher ceiling for compile-time and optimization performance.
If these aren't compelling, there's no real reason.
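To illustrate the symbol-export point: today, helpers are hidden behind a detail namespace by convention only, whereas a module simply doesn't export them. A sketch with hypothetical names (the module version is shown in comments):

```cpp
// Header-style hiding: detail:: is a naming convention, not a barrier —
// clients can still call lib::detail::helper if they really want to.
namespace lib {
namespace detail {
inline int helper(int x) { return x * 2; }
}
inline int api(int x) { return detail::helper(x) + 1; }
}

// Module-style hiding: anything not exported is invisible to importers.
//   export module lib;
//   int helper(int x) { return x * 2; }              // unreachable outside
//   export int api(int x) { return helper(x) + 1; }  // the only entry point
```

With the module version, the helper cannot appear in client code at all, so it can be renamed or removed without breaking anyone.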
Modules are the future, and the rules for them are well thought out. Every compiler has its own version of PCH, and they all work differently in annoying ways.
I am curious to know if that 8.6x speedup is consistent.
I don't see many "fair" benchmarks about this, but I guess it is difficult to properly benchmark module compilation, as the results can depend heavily on the use case.
If modules can reach that sort of speedup consistently, it's obviously great news.
I think that SFINAE and, to a lesser extent, concepts are fundamentally a bit odd when multiple translation units are involved, but otherwise I don't see the problem.
It’s regrettable that the question of whether a type meets the requirements to call some overload or to branch in a particular if constexpr expression, etc, can depend on what else is in scope.
In my opinion this syntax is super good: it lets all function/method names start at the same column, which makes the code way easier to read; a huge readability improvement, in my opinion. Sadly nobody uses it, and the classic way still exists, so there are multiple ways to do the same thing…
https://devblogs.microsoft.com/cppblog/integrating-c-header-...
Rust, Modula-2 and Ada are probably the only ones with module nesting.
as they say "citation needed"
Log scale: Less than 3% done, but it looks like over 50%.
Estimated completion date: 10 March 2195
It would be less funny if they used an exponential model for the completion date to match the log scale.
[1]: https://chuanqixu9.github.io/c++/2025/08/14/C++20-Modules.en...
It seems likely I’ll have to move away from C++, or perhaps more accurately it’s moving away from me.
But you might not be able to use libraries that insist upon modules. There won't be many until modules are widespread.
The current solution chosen by compilers is to basically have a copy of your code for every dependency that wants to specialize something.
For template heavy code, this is a combinatorial explosion.
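One common mitigation today, independent of modules, is to instantiate heavy templates once and suppress the per-TU copies with extern template. A sketch with a hypothetical `square` template:

```cpp
// square.h — a template that every dependent TU would otherwise
// instantiate (and the linker would later deduplicate).
template <class T>
T square(T x) { return x * x; }

// In the header, after the definition, tell other TUs not to
// instantiate the common specialization themselves:
//   extern template int square<int>(int);

// square.cpp — the single TU that actually emits the code:
template int square<int>(int);
```

This trades a little boilerplate for compiling each popular specialization exactly once instead of once per includer.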