
Doxin | 13 days ago

No, because the former definition is still something you can rely on given a specific compiler and a specific machine. Hell, a bunch of UB behaved pretty much universally anyway: compilers would usually still emit sensible code for it.

UB just meant "the spec doesn't define what happens". It didn't use to mean "the compiler can just decide to do any wild thing if your program touches UB anywhere at any time". Hell, with the modern definition UB can apparently time travel: you don't even need to execute the UB code for it to start doing weird shit in some cases.

UB went from "whatever happens when your compiler/hardware runs this is what happens" to "Once a program contains UB the compiler doesn't need to conform to the rest of the spec anymore."
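The "time travel" point can be sketched with a toy function (names are illustrative, not from the thread). Because dereferencing a null pointer is UB, the compiler may assume the pointer is never null and delete a check that textually comes *before* the dereference:

```c
#include <stddef.h>

/* Sketch of "time travel" UB: the dereference below is UB when p is
   NULL, so an optimizing compiler may assume p != NULL throughout the
   function and remove the null check entirely, even though the check
   appears earlier in the source. */
int first_element(int *p) {
    int ok = (p != NULL);  /* check written before the dereference */
    int val = *p;          /* UB if p == NULL                      */
    if (!ok)
        return -1;         /* this branch may be deleted           */
    return val;
}
```

With optimizations on, some compilers reduce this to a plain load, so the `return -1` path silently disappears; the UB "reached back" and broke code that ran before it.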


kace91 | 12 days ago

>the former definition is still something you can rely on given a specific compiler and a specific machine.

>UB just ment "the spec doesn't define what happens"

What comes to mind is that the written code is then relying on a sub-spec, one that is probably undocumented and maybe even unintended, defined by the specifics of that compiler version and platform.

It sounds like it could create a ton of issues, from code that can’t be ported to difficulty for another person grokking the undocumented behavior being relied on.

In this regard, as someone who could potentially inherit this code, I’d actually want the compiler to stop this potential behavior. Am I missing something? Is the spec on its own not complete enough to rely on?

xscott | 12 days ago

Even very simple code can be UB:

    int handle_untrusted_numbers(int a, int b) {
        if (a < 0) return ERROR_EXPECTED_NON_NEGATIVE;
        if (b < 0) return ERROR_EXPECTED_NON_NEGATIVE;
        int sum = a + b;  /* UB if a + b overflows INT_MAX */
        if (sum < 0) {    /* compiler may assume this is never true */
            return ERROR_INTEGER_OVERFLOW;
        }
        return do_something_important_with(sum);
    }
Every computer you will ever use has two's complement for signed integers, and the standard recently recognized and codified this fact. However, the UB fanatics (heretics) insisted that leaving signed overflow undefined is an important opportunity for optimizations, so the compiler is allowed to delete that last if-statement, and your code quietly no longer checks for overflow.
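For contrast, here's a version of the same check that stays defined: test against `INT_MAX` *before* the addition, so the overflow never happens and there is nothing for the compiler to delete. This is a sketch; the error-code values are made-up placeholders, and the original's `do_something_important_with` call is omitted:

```c
#include <limits.h>

#define ERROR_EXPECTED_NON_NEGATIVE -1  /* placeholder values, not */
#define ERROR_INTEGER_OVERFLOW      -2  /* from the original code  */

/* The overflow test runs before the addition, so no signed overflow
   ever occurs and the check cannot be optimized away. */
int add_checked(int a, int b) {
    if (a < 0) return ERROR_EXPECTED_NON_NEGATIVE;
    if (b < 0) return ERROR_EXPECTED_NON_NEGATIVE;
    if (a > INT_MAX - b)
        return ERROR_INTEGER_OVERFLOW;
    return a + b;
}
```

GCC and Clang also provide `__builtin_add_overflow`, which performs the addition and reports overflow without ever invoking UB.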

There are plenty more examples, but I think this is one of the simplest.

Doxin | 6 days ago

I'm not opposed to compilers erroring out on UB. But that's not what happens. Instead of choosing either to proceed and hope all is well, or to stop and error out, compilers take the secret third option of breaking your code even further and telling no one.

pseudohadamard | 11 days ago

One thing you need to add is that UB can be incredibly subtle and almost impossible to spot, even for people with decades of programming experience. The compiler, however - and we're talking almost exclusively gcc here - will spot it and silently break your code. It won't warn "hey, I've spotted UB here!" even with every possible warning enabled; it will just quietly break your code without giving you any indication that it's done so.

It's some of the most user-hostile behavior I've ever encountered in an application.