nixos | 9 years ago
The opposite. When the field was in its infancy, you could keep the whole stack in your head.
How complicated were CPUs in the 1960s?
How many lines of assembler were in the LM?
How many lines are the Linux or FreeBSD kernels? Now add libc.
Now you have a 1970s C compiler.
Now take into account all the optimizations any modern C compiler does. Now make sure there are no bugs _there_.
Now add a Python stack.
Now you can have decent, "safe" code. Most hacks don't target this part; the low-hanging fruit is higher up the stack.
You need a math library. OK, import that. You need some other library. OK, import that.
Oops, there's a bug in one module. Or the admin setup wasn't done right. Or something blew up.
Bam. You have the keys to the kingdom.
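A toy sketch of that failure mode (the library and function names here are hypothetical, invented for illustration): you import a convenience module for one small job, and a single careless line in it hands over everything.

```python
# Hypothetical "math helper" library you imported for a calculator feature.
def evaluate(expr):
    # Bug: eval() executes arbitrary Python, not just arithmetic.
    return eval(expr)

# Your application only wanted this:
print(evaluate("2 + 2"))  # 4

# But any attacker who controls the input string now runs code as you:
print(evaluate("__import__('os').getcwd()"))
```

The application code can be flawless; the imported module's one bad line is the keys to the kingdom.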
And this is all deterministic. Someone _could_ verify that there are no bugs here.
But what about neural networks? The whole point of training is that the programmers _can't_ write a deterministic self-driving algorithm, and have to let a huge NN do the heavy lifting.
And that's not verifiable.
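A minimal sketch of the contrast (a toy braking decision, nothing like a real driving stack; all names and numbers are invented): a hand-written rule can be read and verified line by line, while the "learned" version of the same decision is just a handful of floats produced by training, with nothing for a reviewer to inspect.

```python
import math
import random

# Deterministic rule: every line can be read and checked by a human.
def brake_rule(distance_m, speed_ms):
    # Brake when we are inside a 2-second gap to the obstacle.
    return distance_m < 2.0 * speed_ms

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(examples, steps=5000, lr=0.5):
    # One-neuron "network": its behavior lives entirely in three floats.
    w_d, w_s, b = 0.0, 0.0, 0.0
    for _ in range(steps):
        d, s, label = random.choice(examples)
        pred = sigmoid(w_d * d + w_s * s + b)
        err = pred - label
        w_d -= lr * err * d
        w_s -= lr * err * s
        b -= lr * err
    return w_d, w_s, b

random.seed(0)
# Synthetic training data, labeled by the rule (inputs scaled to ~[0, 1]).
data = [(d / 100.0, s / 40.0, 1.0 if brake_rule(d, s) else 0.0)
        for d in range(0, 100, 5) for s in range(0, 40, 5)]
w_d, w_s, b = train(data)

def brake_nn(distance_m, speed_ms):
    # Nothing here to "read": the decision is three opaque learned numbers.
    return sigmoid(w_d * distance_m / 100.0 + w_s * speed_ms / 40.0 + b) > 0.5
```

You can prove properties of `brake_rule` by inspection. For `brake_nn` all you can do is feed it inputs and watch what comes out, and a real driving network has billions of such weights, not three.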
_This_ is what's going to be running your self-driving car.
That's why I compared software engineering to biology, where we "test" a lot, hope for the best, and have it blow up in our faces a generation later.