Ontonator|1 year ago
GCC leaves the print there because it must. While undefined behaviour famously can time travel, that’s only if it would actually have occurred in the first place. If the print blocks indefinitely then that division will never execute, and GCC must compile a binary that behaves correctly in that case.
ynik|1 year ago
In theory, the compiler could know this, since `printf` is a well-known standard function. In practice, `printf` might even exit the program via SIGPIPE, so I don't think any compiler will assume that it definitely returns.
nlewycky|1 year ago
Is `printf` allowed to loop infinitely? Its behaviour is defined in the language standard and GCC does recognize it as not being a user-defined function.
nayuki|1 year ago
Division by zero is undefined behavior. The compiler can assume that it will not happen.
If the divisor is not zero, then the calculation has no side effects. The compiler may reorder the division above the print, because it would have no observable difference in behavior. This could be useful because division has a high latency, so it pays to start the operation as soon as the operand values are known.
If the divisor is zero, UB means there is no requirement on how the program is compiled, so reordering the division above the print is still legal.
gpderetta|1 year ago
If you replace `if (div)` with an opaque function call, that doesn't change anything, as the function might exit the program, never return, longjmp, or return via an exception.
asdfaoeu|1 year ago
LoganDark|1 year ago
I've always been told that the presence of UB in any execution path renders invalid all possible execution paths. That is, your entire program is invalid once UB exists, even if the UB is not executed at runtime.
Are you saying this isn't quite true?
ynik|1 year ago
If you do `5 / argc`, that's only undefined behavior if your program is called without any arguments; if there are arguments then the behavior is well defined.
Instead, the presence of UB in the execution path that is actually taken renders the whole execution path invalid (including whatever happens "before" the UB). That is, an execution path has either defined or undefined behavior; it cannot be "defined up to point-in-time T". But other execution paths are independent.
Thus, UB can "time-travel", but only if it would also have occurred without time travel. It must be caused by something happening at runtime in the program on the time-travel-free theoretical abstract machine; it cannot be its own cause (no time travel paradoxes).
So the "time-travel" explanation sounds a lot more scary than it actually is.
masklinn|1 year ago
It is not. The presence of UB in an execution path renders that execution path invalid. Undefined behaviours are, essentially, partial functions that are allowed to arbitrarily corrupt program state rather than raise an error.
However, "that execution path" can be extensive in the face of aggressive optimisations.
The "time travel" issue is generally that the compiler can prove some paths can't be valid (they always lead to UB), so it trims those paths out entirely, possibly leaving just a poison value (or a crash).
Thus, although the undefined behaviour that causes the crash "should" occur after an observable side effect, the program is considered corrupt from the point where it will inevitably encounter UB, so the side effect gets suppressed, and the program appears to execute non-linearly (the error condition that follows the side effect triggers before the side effect executes).
protomolecule|1 year ago