top | item 29010973


sillycross | 4 years ago

> Amazingly, we find that the GCC compiler is able to compile Travis’ is_empty C++ function to constant-time code.

It's actually an interesting example where undefined behavior allowed compiler optimization:

(1) dereferencing an invalid pointer is UB

(2) signed integer overflow is UB

This allows the compiler to assume that the program never crashes and the counter never overflows. The loop is then optimized out, since it is read-only and thus has no side effects.
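For readers who haven't clicked through, the code in question looks roughly like this (a sketch; the exact names and types are assumptions based on the article's description):

```cpp
// Minimal reconstruction of the kind of function under discussion:
// is_empty() walks the whole list and compares the count to zero.
struct node {
    node* next;
    int payload;
};

// O(n) on its face: visits every node in the list.
int count_nodes(const node* p) {
    int size = 0;              // signed counter: overflow would be UB
    while (p != nullptr) {     // dereferencing an invalid pointer would be UB
        p = p->next;
        ++size;
    }
    return size;
}

bool is_empty(const node* p) {
    // The loop has no side effects, and the compiler may assume it neither
    // crashes nor overflows, so GCC can reduce this entire call to a single
    // comparison equivalent to: p == nullptr.
    return count_nodes(p) == 0;
}
```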

Noughmad|4 years ago

> It's actually an interesting example where undefined behavior allowed compiler optimization

That is literally the reason why any behavior is considered undefined: so that the compiler can skip checks and produce better-optimized code.

saagarjha|4 years ago

Actually, many instances of undefined behavior are born out of a desire for portability. Some processors will fault on both of those and the standard wants to be inclusive of those behaviors. Compilers happen to be able to use the definition of undefined behavior to also optimize code, but the concept is born out of practicality.

secondcoming|4 years ago

And so you get an optimal, but broken, program. This helps nobody.

joosters|4 years ago

Maybe I'm splitting hairs, but it's not specifically the presence of 'undefined behavior' that allows the compiler optimization. Instead, it's the language specification. The C spec says that integers cannot be relied upon to overflow. The result of this is that compilers are then free to assume that the program they are compiling has NO undefined behavior in it, and so the optimization is possible.

EDIT: To make it clearer, you could imagine an alternate version of C that aborted the program if an integer overflowed. Then there would be no undefined behavior at all - but the optimization is still possible. It's not the UB that helps us here, it's the language spec telling us what behavior is reliable and what is not.

IshKebab|4 years ago

> To make it clearer, you could imagine an alternate version of C that aborted the program if an integer overflowed. Then there would be no undefined behavior at all - but the optimization is still possible.

The optimisation wouldn't be possible in that case because then the program wouldn't abort when the integer overflowed. It would break the defined behaviour that overflow=abort.

minitech|4 years ago

> Then there would be no undefined behavior at all - but the optimization is still possible.

But then an optimization could change the defined behavior of aborting to not aborting, which is essentially what undefined behavior means and really bad if you don’t treat it exactly like undefined behavior.
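To make this concrete, in the hypothetical "abort on overflow" dialect the counter increment would have to behave like the checked version below (a sketch using the real GCC/Clang `__builtin_add_overflow` builtin; the function name is made up). The abort() is an observable effect at a specific point in execution, which is exactly why the compiler could no longer delete the loop wholesale:

```cpp
#include <cstdlib>

// Sketch of what "overflow aborts the program" semantics would demand of the
// counter increment. Because the trap is defined behavior, the compiler must
// preserve it and cannot assume the increment always succeeds.
int checked_increment(int x) {
    int result;
    if (__builtin_add_overflow(x, 1, &result)) {  // GCC/Clang builtin
        std::abort();  // defined behavior: trap rather than wrap
    }
    return result;
}
```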

adrian_b|4 years ago

You need not imagine an alternate version of C; such a version is provided by any decent C compiler.

For example, with gcc you can use either the option "-ftrapv" or the better option "-fsanitize=undefined -fsanitize-undefined-trap-on-error" and the program will abort on integer overflows (with the second option it will also abort for many other error conditions, e.g. access out of bounds).
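For instance, trapping on overflow with -ftrapv can be demonstrated like this (the file path and demo program are assumptions for illustration; -ftrapv itself is a real gcc flag):

```shell
# Write a small program whose signed addition overflows at runtime.
cat > /tmp/trapv_demo.c <<'EOF'
#include <limits.h>
int main(void) {
    volatile int x = INT_MAX;  /* volatile so the overflow isn't folded away */
    x = x + 1;                 /* signed overflow: aborts under -ftrapv */
    return 0;
}
EOF
gcc -ftrapv /tmp/trapv_demo.c -o /tmp/trapv_demo
/tmp/trapv_demo || echo "trapped with nonzero status $?"
```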

em3rgent0rdr|4 years ago

I suppose it is a safe assumption that the counter will never overflow, provided the memory space is not large enough to hold that many data structs. If the counter were a smaller int than what the virtual memory could address, such as a short int on a 32-bit system, that assumption could fail. But on a 64-bit Linux system, which allows at most 128 TiB of virtual address space per process, a 64-bit signed int can hold up to 2^63 − 1, which is far larger than the number of structs a 128 TiB address space could ever reference, so the assumption that the size counter never overflows is safe.
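That arithmetic can be checked directly (a sketch; the node layout and the 128 TiB figure are assumptions taken from the comment above):

```cpp
#include <cstdint>

// A plausible list node on a 64-bit system: 8-byte pointer + 8-byte payload.
struct node {
    node* next;
    long payload;
};

// 128 TiB = 2^47 bytes of user virtual address space, as on x86-64 Linux.
constexpr uint64_t address_space_bytes = 128ULL << 40;

// Every distinct node needs at least sizeof(node) bytes, bounding the count.
constexpr uint64_t max_distinct_nodes = address_space_bytes / sizeof(node);

// A 64-bit signed counter holds up to 2^63 - 1, far above that bound, so a
// cycle-free list can never overflow it.
constexpr bool counter_can_overflow = max_distinct_nodes > uint64_t(INT64_MAX);
```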