
Modern C++ Won't Save Us

326 points | neptvn | 7 years ago | alexgaynor.net | reply

388 comments

[+] jandrewrogers|7 years ago|reply
A significant issue I have with C++ is that even if your code base is pure C++17, the standard library is a Frankenstein's monster of legacy and modern C++ mixed together that required many compromises to be made. A standard library that usefully showed off the full capabilities of C++17 in a clean way would have to jettison a fair amount of backward compatibility in modern C++ environments.

I've noticed that more and more people like me have and use large alternative history "standard libraries" that add functionality, reimagine the design, and in some cases reimplement core components based on a modern C++ cleanroom. I've noticed that use of the standard library in code bases is shrinking as a result. You can do a lot more with the language if you have a standard library that isn't shackled by its very long history.

[+] near|7 years ago|reply
Because C++ is my primary language, and I always work on my codebases alone, I dropped the standard library and implemented my own replacement. It's not at all practical for most I'm sure, but it allows me to evolve the library with new revisions of the C++ standard without being absolutely fixed on backward compatibility.

One of the things I did for safety is that all access methods of all of my containers will bounds check and throw on null pointer dereferences ... in debug and stable mode. And all of that will be turned off in the optimized release mode, for applications where performance is absolutely critical. The consistency is very important.

Whenever I get a crash in a release mode, I can rebuild in debug mode and quickly find the issue. And for code that must be secure, I leave it in stable mode and pay the small performance penalty.
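A minimal sketch of the debug/stable/release pattern described above. The class and the `LIB_RELEASE_MODE` macro are illustrative inventions, not the commenter's actual library: every accessor checks for null and out-of-bounds unless the release configuration compiles the checks out.

```cpp
#include <cstddef>
#include <stdexcept>

// Sketch: accessors bounds-check and null-check in debug/stable builds;
// defining LIB_RELEASE_MODE (a hypothetical macro) strips the checks so
// release builds pay nothing for them.
template<typename T>
struct checked_vector {
  T* data = nullptr;
  std::size_t count = 0;

  T& operator[](std::size_t index) {
#if !defined(LIB_RELEASE_MODE)
    if (!data) throw std::runtime_error("null container access");
    if (index >= count) throw std::out_of_range("index out of bounds");
#endif
    return data[index];  // raw access in release mode
  }
};
```

A crash that reproduces under the checked build then points straight at the offending access, which is the debugging workflow described above.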

[+] masklinn|7 years ago|reply
> A significant issue I have with C++ is that even if your code base is pure C++17, the standard library is a Frankenstein's monster of legacy and modern C++ mixed together that required many compromises to be made. A standard library that usefully showed off the full capabilities of C++17 in a clean way would have to jettison a fair amount of backward compatibility in modern C++ environments.

Not to mention C++ does not really provide the facilities necessary for convenient, memory-safe and fast APIs[0].

And as demonstrated by e.g. std::optional, the standard will simply offer an API which is convenient, fast and unsafe (namely that you can just deref an std::optional and it's UB if the optional is empty).

[0] I guess using lambdas a hell of a lot more would be an option but that doesn't seem like the committee's style so far.
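To make the std::optional point concrete, a small sketch (helper names are mine): `operator*` on an empty optional compiles cleanly and is undefined behavior, while the checked accessors must be opted into.

```cpp
#include <optional>

// Three ways to read an optional<int>:
int unchecked(const std::optional<int>& o) {
  return *o;                // no check: UB if 'o' is empty
}
int checked(const std::optional<int>& o) {
  return o.value();         // throws std::bad_optional_access if empty
}
int with_default(const std::optional<int>& o) {
  return o.value_or(42);    // safe fallback, no exception
}
```

The convenient spelling (`*o`) is the unsafe one; safety costs an extra method call, which is the design choice being criticized.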

[+] millstone|7 years ago|reply
I agree with this and would take it a step further, and say that recent changes to the STL are the worst parts of modern C++. For example std::regex supports 6 distinct syntaxes, the PRNG stuff is massively over-engineered, the "extensions for parallelism" add complexity without giving enough knobs for any real perf improvement. Meanwhile there are gaping holes like UTF-8 support. It's a sad state.
[+] colanderman|7 years ago|reply
What parts specifically? By my estimation, the only non-deprecated part of the standard library that really reeks of pre-C++11 (what I believe most consider the advent of "modern") is iostream. Most of e.g. the containers have been kept up to date with new features of the language (e.g. move semantics, constexpr).

The standard library certainly is lacking things which are commonly used (say, JSON parsing or database connection), but I think this is a conscious decision (and IMO the correct decision) to include only elements that have a somewhat settled, "obvious", lowest-common-denominator semantics. There's rhyme and reason to most of the most commonly used elements that is decidedly lacking from e.g. Python's (much more extensive) standard library.
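As one concrete instance of the retrofitting mentioned above: `push_back` gained an rvalue overload in C++11, so a string can be moved into a vector instead of copied. A quick sketch:

```cpp
#include <string>
#include <utility>
#include <vector>

// Moving a long string into a vector transfers its heap buffer rather
// than copying it (for strings long enough to defeat the small-string
// optimization). The moved-from string is left valid but unspecified.
std::vector<std::string> make_vec() {
  std::vector<std::string> v;
  std::string s(100, 'x');
  v.push_back(std::move(s));  // rvalue overload: no 100-byte copy
  return v;                    // NRVO / move on return, again no copy
}
```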

[+] kabdib|7 years ago|reply
I work in a shop where there was a significant effort put into a cross-platform library a long time ago, but that old code has been showing cracks and emitting creaks ("Hey, folks, guess how many debugging hours it took to find out that lambdas didn't work here, either"). Use of the standard library is frowned upon except when absolutely necessary, so there's no avoiding the in-house thing. From time to time someone will joust at it and pull a particularly screwball section forward a decade or two, but on the whole the old stuff is just never going away short of a catastrophe. It makes onboarding interesting, and it makes you reflect philosophically on expertise that is valuable absolutely no place else.

I work on other projects, or on my own stuff at home, and I can breathe again. I don't always need reverse iterators on a deque, but dammit they are there if I need them.

However, I have been in too much C runtime code to be entirely happy. I've seen too many super-complicated disasters, for instance someone who really wanted to write the Great American OS Kernel but wasn't allowed on the team, and so had to make their bid for greatness in stdio.h instead. You learned to tread carefully in that stuff, the only good news being that if you broke something it might have turned out to be already busted anyway and no harm done, philosophically speaking, I mean.

There are no good answers :-)

[+] fsloth|7 years ago|reply
So the language is evolving, it is used by projects that are old and still in good enough shape that they can adapt to some of the new concepts, and as sugar on top, it does not break backwards compatibility.

As such it just sounds like a mature technology with a huge installed base that is still holding traction. Generally maturity, traction and adaptability are indicators of health, not malady.

Beauty is overstated. Engineering can be art but it doesn't have to be.

Jokes aside, I use C++ daily and see it as Warty McWartface and could spend a long time ruminating about its faults. But adapting old stuff to new boundaries is always going to be messy. Generally, rewriting history creates more problems than it solves.

[+] fooker|7 years ago|reply
I don't see the problem. You are free to use such a modern library (Google does, it's called absl).

The good thing here is that the standard library doesn't require 'magic' to be implemented (unlike Swift where the standard library relies on hidden language hacks).

[+] hak8or|7 years ago|reply
Can you or others post such alternative standard libraries? The only ones that come to mind are Boost (which is a nightmare for compile times and I feel is a mishmash of old and new) and Google's Abseil, which I haven't actually tried enough to form an opinion about.
[+] lallysingh|7 years ago|reply
What legacy? It's not like there was a single "before time." There are problems coming up with all of it, because the underlying runtime model provides too few guarantees. We'll be plugging holes the rest of our natural lives.
[+] namirez|7 years ago|reply
This has been discussed extensively in the C++ community. I think if you need very safe code, you shouldn't use string_view or span without thinking about the potential consequences. These were added to the language to avoid memory allocation and data copies in performance-critical software.

Herb Sutter has concrete proposals to address this issue and Clang already supports them: https://www.infoworld.com/article/3307522/revised-proposal-c...

[+] pjmlp|7 years ago|reply
It is true, C++ has several warts some of them caused by the copy-paste compatibility with C.

Which is both a blessing and a curse. A blessing as it allowed us Pascal/Ada/Modula refugees to never deal with what was already an outdated, unsafe language by the early 90's.

But it also makes it relatively hard to write safe code when we cannot prevent team members, or third-party libraries, from using C-isms in their code.

Regarding the alternatives, Swift is definitely not an option outside Apple platforms. And even there, Apple still focuses on C++ for IOKit, Metal and LLVM-based tooling.

Rust, yes. Some day it might be, especially now with Google, Microsoft, Amazon, Dropbox,... adopting it across their stacks.

However, for many of us it still doesn't cover the use cases we use C++ for, so it is not like I will impose on myself, the team and customers a productivity hit, taking double the time it takes to write a COM component or native bindings in C++ for .NET consumption, just to feel good.

When we get Visual Rust, with mixed mode debugging, Blend integration and a COM/UWP language projection for Rust, then yeah.

[+] masklinn|7 years ago|reply
> It is true, C++ has several warts some of them caused by the copy-paste compatibility with C.

I mean, that's a bit of a cop-out given that C++ has more non-C warts and UB than it has C warts and UB at this point. It's not "copy-paste compatibility with C" that made dereferencing an empty std::unique_ptr or std::optional undefined behavior.

[+] Animats|7 years ago|reply
The C++ people are trying to refit ownership to the language without adding a borrow checker. This is painful. They've made it possible to write code that expresses ownership, but they can't catch all the places where the abstraction leaks.

string_view is really a non-mutable borrow. But the compiler does not know this.
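A sketch of that leak in the abstraction (function names here are illustrative): nothing ties a string_view's lifetime to the storage it views, so the compiler accepts a "borrow" that outlives its owner.

```cpp
#include <string>
#include <string_view>

// A string_view borrows storage it does not own. The compiler does not
// connect the view's lifetime to the string's, so this compiles and
// silently dangles:
std::string_view dangling() {
  std::string local = "temporary";
  return std::string_view(local);  // 'local' dies here; using the view is UB
}

// The safe pattern: only view storage that outlives the view, e.g. a
// caller-owned string passed by reference.
std::string_view first_word(const std::string& s) {
  auto pos = s.find(' ');
  return std::string_view(s.data(),
                          pos == std::string::npos ? s.size() : pos);
}
```

A borrow checker would reject `dangling()` at compile time; in C++ it is merely a bug waiting for a caller.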

[+] raphlinus|7 years ago|reply
From the article:

> Dereferencing a nullptr gives a segfault (which is not a security issue, except in older kernels).

I know a lot of people make that assumption, and compilers used to work that way pretty reliably, but I'm pretty confident it's not true. With undefined behavior, anything is possible.

[+] _wmd|7 years ago|reply
Linux hit a related situation: a harmless null pointer dereference was treated by GCC as a signal that a subsequent null test could not be true, causing the test to be optimized away. https://lwn.net/Articles/575563/
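The pattern from that kernel bug, sketched in miniature (names are illustrative, not the actual kernel code): once the pointer has been dereferenced, the compiler may assume it is non-null and delete the later check.

```cpp
// The dereference on the first line licenses the compiler to assume
// 'dev' is non-null, so the subsequent null test may be deleted as
// dead code -- exactly the optimization that bit the kernel.
struct device { int flags; };

int broken_check(device* dev) {
  int flags = dev->flags;   // UB if dev == nullptr
  if (dev == nullptr)       // may be optimized away entirely
    return -1;
  return flags;
}

// Correct ordering: test the pointer before touching it.
int fixed_check(device* dev) {
  if (dev == nullptr)
    return -1;
  return dev->flags;
}
```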
[+] kccqzy|7 years ago|reply
Absolutely. In my experience, if clang can deduce that a function will definitely trigger UB, such as definitely dereferencing a null pointer, it generally optimizes everything after the dereference into a single ud2 instruction (which raises the #UD exception in the CPU).

This is something really hardwired into the C and C++ languages. Even if the underlying operating system perfectly supports dereferencing null pointers, compilers will always treat it as undefined behavior. (On Linux, root can mmap a page of memory at address 0, and certain linker options can cause the linker to place the text section starting at address 0 as well.)

[+] tedunangst|7 years ago|reply
The irony is that it's mostly unsafe when you do test for null, since that's what lets the compiler omit the test; if there's no evidence the pointer can be null, you just get a normal memory access. The optimizer is not optimized for the most intuitive behavior.
[+] kevin_thibedeau|7 years ago|reply
Definitely not true. Consider an IoT device without an MMU.
[+] jclay|7 years ago|reply
I really don't get all the hate that C++ gets. The suggested alternatives in the article are Rust and Swift. What if you need to develop a cross-platform GUI that has a backend running a CUDA or OpenCL algorithm? For the former, you can use Qt, which isn't without its warts, but is pretty tried and true in my experience (see KDE, VTK, etc). For the latter, you'll end up writing your CUDA code in C++ anyway. I guess you could go the route of writing bindings, but that is not without additional effort. Not that it won't happen for Rust, but C++ also has tooling suited for enterprise use that is largely unmatched in other languages (Visual Studio, Qt, etc). Sandboxing, static analysis, and fuzzing tools are also mostly built for C/C++ codebases. It's also an ISO standard language, which makes it both a language full of warts due to decision by committee, but also a candidate for a stable, long-lasting language that will outlive many of us. (Try finding an ISO standard language you don't hate.)

Either way, C++ is certainly not for every project, but the articles scattered around the web claiming it should be superseded by Rust are plentiful. These opinion pieces make no attempt to credit C++ for the cases where it does make sense to use it. Despite its quirks, it is still the best way to program HPC applications or cross-platform GUIs that are not Electron-based. The security tools around it and the fact that it's an ISO standard language make it a solid choice for many enterprises.

[+] mannykannot|7 years ago|reply
I do not think it helps to think in emotional terms such as 'hate'. There is nothing wrong with discussing potential problems, and the current utility of the language should not stop us asking whether we could do better in future.

FWIW, I use C++, not Rust or Swift, and I have a fair amount of knowledge and experience vested in it, but I think these questions are worth asking.

[+] badsectoracula|7 years ago|reply
C++ does have its positives, as you mentioned, but those positives do not make its negatives go away, nor having negatives means that there aren't positives. You can dislike some parts of the language while still using it for its positive aspect - that doesn't mean the negative parts do not exist nor mentioning them means that there are no positives.
[+] magila|7 years ago|reply
While there are obviously still cases where C++ makes sense to use today, those cases are overwhelmingly based on the age and maturity of the C++ ecosystem. Now that Rust has proven that a language can provide memory safety without compromising (much) on performance, it is clear that the scope of C++'s supremacy is in permanent decline.

As Rust (or another language with similar safety/performance properties) matures and its ecosystem grows, C++ will increasingly become a language of tiny niches and legacy codebases.

In other words: C++ is the new Fortran.

[+] 932|7 years ago|reply
Yeah, agreed. The points in the article are valid, but they're quirks you learn and get past the first time. I still shoot myself in the foot sometimes even though I don't have a single bare new/malloc outside a shared/unique_ptr! But that's C++ for you.

But, C/C++ is the best option for us for high-performance network processing. We're dabbling with Rust for small applications where we would use Python previously and it's working pretty well -- but there's no way we could use Rust for the core application yet. Modern C++ has really grown on me and it's sometimes a love/hate relationship but totally a huge improvement over ancient C++ or C.

[+] bobajeff|7 years ago|reply
>Not that it won't happen for Rust, but C++ also has tooling suited for enterprise use that are largely unmatched in other languages

Hopefully that stuff will be helped with things like Language Server Protocol and Debug Adapter Protocol.

[+] 0xDEEPFAC|7 years ago|reply
What do you need saving from - Ada has existed for nearly 30 years now ; )
[+] ajxs|7 years ago|reply
I came here to the comments to post this exact thing, haha. I'm very late to the Ada party, and I'm amazed at how ahead of its time this language was. It's still very usable and modern by today's standards.
[+] jasonhansel|7 years ago|reply
Can someone at least make a linter that ensures you only use a "safe" subset of C++?
[+] pfultz2|7 years ago|reply
Clang's lifetime profile will catch the first example:

    <source>:8:16: warning: passing a dangling pointer as argument [-Wlifetime]
      std::cout << sv;
                   ^

    <source>:7:38: note: temporary was destroyed at the end of the full expression
      std::string_view sv = s + "World\n";
                                         ^
And cppcheck will catch the second example:

    <source>:7:12: warning: Returning lambda that captures local variable 'x' that will be invalid when returning. [returnDanglingLifetime]
        return [&]() { return *x; };
               ^
    <source>:7:28: note: Lambda captures variable by reference here.
        return [&]() { return *x; };
                               ^
    <source>:6:49: note: Variable created here.
    std::function<int(void)> f(std::shared_ptr<int> x) {
                                                    ^
    <source>:7:12: note: Returning lambda that captures local variable 'x' that will be invalid when returning.
        return [&]() { return *x; };
               ^
Cppcheck could probably catch all the examples, but it needs to be updated to understand the newer classes in C++.
[+] safercplusplus|7 years ago|reply
Others disagree, but I suggest that someone could make such a linter. As others point out, the Core Guidelines "lifetime profile checker"[1] is designed to be an advanced static analyzer that restricts how many C++ elements can be used. It's not finished yet, and the current version is not designed to achieve complete memory safety (and doesn't address data race safety). Whether or not subsequent versions could match the full safety enforcement of Rust's compiler seems to be a matter of some debate.

But there is an alternative/complementary approach, which is to simply avoid potentially unsafe C++ elements, like pointers/references, arrays, std::string_views, std::threads, etc., substituting them with safe, largely compatible replacements[2]. This approach has the benefit that an associated safety-enforcing "linter" would not impose the same kinds of "severe" usage restrictions that the lifetime profile checker (or, say, the Rust compiler) does.

[1] https://devblogs.microsoft.com/cppblog/lifetime-profile-upda...

[2] https://github.com/duneroadrunner/SaferCPlusPlus

edit: grammar

[+] steveklabnik|7 years ago|reply
The Core Guidelines are an attempt at this, but it’s not fully safe. Safer, which matters! But not safe.

There isn’t really any useful safe subset of C++. If there were, Rust may never have been created in the first place.

[+] 0815test|7 years ago|reply
The answer is essentially no, at least if you're seeking substantial levels of assurance or safety. Even the C++ Core Guidelines effort, https://github.com/isocpp/CppCoreGuidelines which is the closest thing to what you describe and is driven by influential members of the ISO C++ community including B. Stroustrup, does not claim that they'll be able to make C++ memory safe.
[+] User23|7 years ago|reply
For systems programming languages, safe by default with scoped unsafe code is a Pareto improvement on unsafe everywhere.
[+] jmole|7 years ago|reply
Question - How does one write microcontroller code (or other memory-mapped I/O code) using a memory-safe language?
[+] systemBuilder|7 years ago|reply
Many of the problems he talks about come from the lunacy of the C++ compiler creating all sorts of temporaries and calling hidden type conversion functions, making all sorts of assumptions that it should never ever ever make without being told by the programmer. That is why C will always be a better language than C++ on a fundamental level. In this area Stroustrup took C in a bad direction.
[+] nitwit005|7 years ago|reply
The string_view issue has popped up even in relatively safe languages. Java's String class used to do something similar, where substring returned a String that referenced the original String object's internal array to avoid a copy. They gave up on it because too many people accidentally held references to large strings and leaked memory that way.
[+] 0xe2-0x9a-0x9b|7 years ago|reply
The article makes no call for deeper C++ code analysis by the compiler. Deeper analysis will be the future of C++ - the article fails to foresee this.
[+] shmerl|7 years ago|reply
> Nonetheless, the question simply must be how we can accomplish it, rather than if we should try. Even with the most modern C++ idioms available, the evidence is clear that, at scale, it's simply not possible to hold C++ right.

So, how then? That's the main question indeed :)

[+] sys_64738|7 years ago|reply
Today's C++ will be considered a cobbled together relic in a few C++ standards time periods!
[+] ncmncm|7 years ago|reply
By then Rust will also seem a cobbled-together relic, and you will be chasing the new hotness. In the meantime, we are writing the code that makes the world work. In C++.

By then, many will also be writing it in Rust, and you will be sneering at them, too. It has always been easy to sneer at people busy making things work.

[+] MiroF|7 years ago|reply
The string_view example is surprising and certainly something I could have fallen for.

I feel like the lambda example is pretty contrived. If I was returning a lambda that was capturing values by reference, I would already be pretty wary of UB.
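For reference, the lambda example being discussed (it also appears in the cppcheck output quoted elsewhere in this thread) boils down to a reference capture of a by-value parameter; capturing by value is the fix. A sketch, with function names of my own choosing:

```cpp
#include <functional>
#include <memory>

// The capture [&] takes a reference to the parameter 'x', which is
// destroyed when the function returns -- calling the result is UB.
std::function<int()> broken(std::shared_ptr<int> x) {
  return [&]() { return *x; };  // dangling capture
}

// Capturing by value copies the shared_ptr into the lambda, so the
// lambda co-owns the int and keeps it alive as long as needed.
std::function<int()> fixed(std::shared_ptr<int> x) {
  return [x]() { return *x; };
}
```

The only visible difference is `[&]` versus `[x]`, which is why eyes skimming the capture list miss it.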

[+] umanwizard|7 years ago|reply
I guess it comes down to the individual reader. I had to look at the lambda example several times to realize what the problem was with it. I guess my eyes just skim over the capture section unless I have some good reason to look at it. The string_view example, on the other hand, was immediately obviously wrong to me.
[+] IshKebab|7 years ago|reply
> Dereferencing a nullopt however, gives you an uninitialized value as a pointer, which can be a serious security issue.

Is this really true? Surely it just gives you an uninitialised `int` (or whatever is in the `optional`)?

[+] leshow|7 years ago|reply
Rust and Swift have different definitions of memory safety, don't they?
[+] 0815test|7 years ago|reply
Yes, AIUI Swift does not ensure memory safety for concurrent code like Rust does. You have to expressly opt-in to concurrency-safety, and it's not checked by the compiler. Go definitely has this issue, which is admittedly bizarre for a language that's so often used to code network-oriented services making heavy use of concurrency.
[+] sayusasugi|7 years ago|reply
Any HN post mentioning C++ will inevitably be invaded by the Rust Evangelism Strikeforce.
[+] wutbrodo|7 years ago|reply
You picked a bizarre article to make that comment on...It's hardly irrelevant to the post, as the author's central thesis is that there's no case for choosing C++ over languages like Rust and Swift (he says as much in the article).
[+] insulanian|7 years ago|reply
And rightly so! What's wrong with spreading awareness about safer alternative? If that wasn't the case in the past, we'd still be programming in Cobol and Fortran.
[+] throwupaway123|7 years ago|reply
The Rust Evangelism Strikeforce only exists in completely inane comments like yours. Maybe instead of posting memes you could comment on the actual content of the article, not just the headline?