brianpgordon | 6 years ago
The author makes it sound like the people working on optimizing compilers are deliberately seeking out these weird corner cases and selecting some random surprising behavior for them out of a hat, gleefully imagining how confusing it will be for end users. That's not how it works. Optimizers can be extraordinarily complex and need to maximize this ill-defined thing called "performance" in a highly multi-dimensional solution space. They ping-pong around inside this space constrained only by the specific requirements of the standard, and it's not surprising that some of the techniques used produce counter-intuitive results when the programmer is breaking the rules and relying on undefined behavior.

It's kind of like if you trained a neural network to classify cat and dog pictures, and then you showed it a picture of a fire truck and expected it to give you a useful result.
The idea of a new version of the C standard that defines some of the most surprising undefined behavior is an intriguing one, though, and I'd be curious to see how much that really impacts the ability of the optimizer to do its work.
SAI_Peregrinus | 6 years ago