(no title)
bwarp|14 years ago
To be honest, the "complete understanding" died way before the original post. It died the moment they started building computers with integrated circuits inside them (before 1970) and complexity rocketed exponentially. It died doubly so the moment electrical engineering and the practical side of computer science diverged into "them" and "us".
If you want to use something you fully understand, you will probably need to buy a crate of transistors, resistors and diodes and wirewrap yourself an entire computer from scratch, PDP-7 style.
This fact is a warning: we really are building things with abstraction hierarchies so deep that knowledge ends up divided among specialists. One day nobody will be able to hope to comprehend the whole stack in a lifetime.
Confusion|14 years ago
This is a semantic discussion about the meaning of 'understanding': does it mean you can globally explain how the system works and could come to understand the smallest detail of every part? Or does it mean you understand the smallest detail of every part?
The latter is a nonsensical definition: if that is the case, then nobody understands processors, because nobody understands transistors, because nobody understands quantum mechanics, because nobody understands why the fundamental forces act in certain ways. Nobody understands Newton's laws, nobody understands where babies come from and nobody understands what it means to perform a 'computation'[1].
Of course, that means the former is the intended definition.
[1] http://plato.stanford.edu/entries/church-turing/
JackC|14 years ago
The interesting way to construe the article's claim is not that it's impossible to know everything, but that it's impossible to know everything that people already know about the field you work in.
Were there blacksmiths who knew everything anyone knew about forging swords? Did Newton or Da Vinci know everything anyone knew about the various fields they were expert in? Are there farmers now who know everything anyone knows about how farming works? The article claims that at some point it became a certainty that programmers cannot know everything that anyone knows about how to use the tools they use and what those tools do. The stack is too complex. That's at least a sensible and interesting claim.
bwarp|14 years ago
To be honest, the bedrock abstraction should stop at "what humans can realistically create with their own hands from nothing". You can make your own transistor quite easily and Ebers-Moll provides a nice set of rules to work with.
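The Ebers-Moll rules mentioned above really are simple enough to work with by hand. A minimal sketch in Python of the forward-active approximation for an NPN transistor (all parameter values here are illustrative assumptions, not measurements of any particular device):

```python
import math

# Ebers-Moll, forward-active approximation for an NPN BJT.
# Parameter values below are assumed typical small-signal figures.
I_S = 1e-14    # saturation current, amperes (assumed)
V_T = 0.02585  # thermal voltage kT/q at ~300 K, volts
BETA = 100     # forward current gain (assumed)

def collector_current(v_be):
    """Collector current (A) for a base-emitter voltage v_be (V)."""
    return I_S * (math.exp(v_be / V_T) - 1.0)

def base_current(v_be):
    """Base current (A), using I_B = I_C / beta."""
    return collector_current(v_be) / BETA

# At V_BE = 0.65 V this gives a collector current on the order of
# a milliamp with the assumed parameters above.
ic = collector_current(0.65)
```

The point is that two constants and one exponential get you usably close to real transistor behaviour, without any appeal to the underlying quantum mechanics.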
The quantum physicists and philosophers can remain arguing about technicalities then and let the rest of the world observe, understand and create.
gaius|14 years ago
jgw|14 years ago
Interesting historical point - the BBC Micro was designed by Acorn Computers, the company that went on to create the ARM processors that are so ubiquitous today.
psquid|14 years ago
bwarp|14 years ago
I (the parent of your post) actually still have a BBC Master (and the advanced reference manuals) lying around for precisely that reason. It's quite a handy and very powerful little machine, to be honest.
It even runs LISP (AcornSoft LISP).
eterps|14 years ago
bwarp|14 years ago
3D printers are often promoted as being able to print themselves, i.e. as self-replicating. They are not: they can print only a small fraction of their own parts, and only the non-complex ones.