keldaris | 1 year ago
In reality, if you're writing C in 2025, you have a finite set of specific target platforms and a finite set of compilers you care about. Those are what matter. Whether my code is robust with respect to some 80s hardware that did weird things with integers, I have no idea and really couldn't care less.
msla | 1 year ago
Because I want the next version of the compiler to agree with me about what my code means.
The standard is an agreement: If you write code which conforms to it, the compiler will agree with you about what it means and not, say, optimize your important conditionals away because some "Can't Happen" optimization was triggered and the "dead" code got removed. This gets rather important as compilers get better about optimization.
uecker | 1 year ago
Still, while I acknowledge that this is a real issue, in practice I find that my C code from 30 years ago still works.
It is also partly the fault of users. Why do so many users favor the most aggressively optimizing compilers? Every user filing bugs or complaining in the bug tracker about aggressive optimization breaking code, every user asking for better warnings, would help us a lot in pushing back on this. But if users prefer compiler A over compiler B because it gives a 1% improvement in some irrelevant benchmark, it is difficult to argue that this is not exactly what they want.
keldaris | 1 year ago
In my experience, if you don't try to be excessively clever and just write straightforward C code, these issues almost never arise. Instead of wasting my time on the standard, I'd rather spend it validating the compilers I support and making sure my code works in the real world, not the one inhabited by the abstract machine of ISO C.