tikej | 3 years ago
Since a computer in principle never makes mistakes (in practice there are of course bugs, but they are usually different in nature from human errors), it changes what is possible and most convenient. You no longer have to optimise as heavily for simplicity, for example. On the other hand, computers basically can’t deal with ambiguity, so the rules and statements have to be stated very simply and clearly.
EDIT: One example that comes to mind is indices in functions. Usually they are just additional arguments that differ somehow from the “main” arguments, for example by often being non-negative integers. For humans it is easier to think about and operate on indices separately from the rest of the arguments. But for the computer it’s all the same: every argument is treated simply as an argument (depending on the implementation, of course), and there is no need to single out indices, since every argument is equally “special”.
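A minimal sketch of this point (the function name and parameters are my own illustration, not from the comment): in math notation a sequence term is written with the index set apart, e.g. a subscript in a_n, while other parameters sit in ordinary argument position. To a computer there is no such distinction; the index is just one more argument.

```python
def geometric_term(ratio, n):
    # Mathematically this is a_n = ratio^n, where n is typographically
    # "special" (a subscript) and ratio is an ordinary parameter.
    # Here both are plain arguments, treated identically by the machine.
    return ratio ** n

# "a_3 with ratio 2" is just a call with two ordinary arguments:
print(geometric_term(2, 3))  # → 8
```

Any convention that distinguishes the index (argument order, a keyword argument, a type annotation) exists purely for the human reader, not for the machine.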
I believe computers can change the landscape of what counts as the best notation. This is an interesting, interdisciplinary topic to explore.