sillycross|4 years ago
It's actually an interesting example where undefined behavior allowed compiler optimization:
(1) dereferencing an invalid pointer is UB, and
(2) signed integer overflow is UB.
This allows the compiler to assume that the program never crashes and that the counter never overflows. Since the loop body is read-only and therefore has no side effects, the compiler can optimize the loop away entirely.
Noughmad|4 years ago
That is literally the reason why any behavior is considered undefined: so that the compiler can skip checks and produce better-optimized code.
joosters|4 years ago
EDIT: To make it clearer, you could imagine an alternate version of C that aborted the program if an integer overflowed. Then there would be no undefined behavior at all - but the optimization is still possible. It's not the UB that helps us here, it's the language spec telling us what behavior is reliable and what is not.
IshKebab|4 years ago
The optimisation wouldn't be possible in that case, because the program would then fail to abort when the integer overflowed. That would break the defined behaviour that overflow = abort.
minitech|4 years ago
But then an optimization could change the defined behavior of aborting into not aborting, which is essentially what undefined behavior means, and that is really bad if you don't treat it exactly like undefined behavior.
adrian_b|4 years ago
For example, with gcc you can use either the option "-ftrapv" or the better option "-fsanitize=undefined -fsanitize-undefined-trap-on-error" and the program will abort on integer overflows (with the second option it will also abort for many other error conditions, e.g. access out of bounds).
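A sketch of such a session (the file name is illustrative; the flags are real gcc options):

```shell
# Create a tiny demo with a signed overflow (file name is illustrative).
cat > overflow.c <<'EOF'
#include <limits.h>
#include <stdio.h>
int main(void) {
    int x = INT_MAX;
    x = x + 1;              /* UB in standard C: signed overflow */
    printf("%d\n", x);
    return 0;
}
EOF

# -ftrapv instruments every signed add/sub/mul; the overflow aborts:
gcc -ftrapv overflow.c -o overflow
./overflow || echo "trapped (exit $?)"

# UBSan catches this plus many other UB classes; the trap option turns
# its reports into immediate aborts instead of printed diagnostics:
gcc -fsanitize=undefined -fsanitize-undefined-trap-on-error overflow.c -o overflow
./overflow || echo "trapped (exit $?)"
```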