mbell|2 years ago
Hardware addition is filled with edge cases that cause it to not work as expected; I don't see that as much different from the memory safety edge cases in most programming models. So, by that logic, is there no way to reason about any program that uses hardware addition?
Asooka|2 years ago
I am not aware of a single case where integer addition does not work as expected at the hardware level. Of course if you have a different expectation than what the hardware does it could be called "unexpected", but I would classify this more as "ignorance". I think we can reword it as "addition must be predictable and consistent".
This is not the case in standard C, because addition can produce a weird value that the compiler assumes obeys a set of axioms but in fact doesn't, due to hardware semantics. Most C compilers allow you to opt into hardware semantics for integer arithmetic, at which point you can safely use the result of adding any two integer values. That is really the crux of the matter here - if I write "a = b + c", can I safely examine the value "a"? In C, you cannot. You must first make sure that b and c fulfil the criteria for safe use with the "+" operator. Or, as is more usual, close your eyes and hope they do. Or just turn on hardware semantics and only turn them off for specific very hot regions of the code where you can actually prove that the input values obey the required criteria.
Kranar|2 years ago
That's the difference between something that is safe and local but perhaps unexpected because you lack knowledge of how it works, and something that is unsafe because there is absolutely no way to know what will happen and the consequences can be global.
arcticbull|2 years ago
On the same hardware, yes, but the same C or C++ program may behave differently on different hardware specifically because the C abstract machine doesn't define what's supposed to happen. This leaves it up to (to your point) the compiler or the hardware what happens in, e.g., an overflow condition.
If you're planning on running the program on more than one CPU revision then I'd argue it introduces a similar level of risk, although one that's probably less frequently realised.