As a child, presumably like many others, I was pretty enthralled by the concept of magic: magical formations, spells, and the like. And it was equally disheartening not to find oneself in such a world. But growing up, I realized that we already have magic in our world; we call it electricity. We have formations to harness its powers, which would be the chips, and we have spells, aka coding, to bring them all together and literally perform magic.
This dawned on me as well when I watched a Feynman video on electromagnetism. He pointed out that you can explain the "how" of its workings, but not the "why". The fundamental forces of the universe are basically magic fields. And this is true for anything that is fundamental and can't be deconstructed further into more primary components.
Danny Hillis makes a similar analogy in 'The Pattern on the Stone', where he reasons that if someone 200-300 years ago were told what he did for work, he'd be burned as a witch.
I grew up with Harry Potter, and now I like to say that I eventually became a wizard. I just scribble runes onto paper, then do clicky-clacks on my keyboard, translating the runes into spells that make machines do magic.
I had a dream once where I was somehow in the far, far future. At first everything seemed to have regressed to a more Middle Ages / Middle-earth type aesthetic. But then it became obvious that this was intentional: people had rejected a lot of technology and hidden other technology. So there were many things that did indeed seem like old-school magic but were actually just powered behind the scenes by some very advanced tech.
> But growing up, I realized that we already have magic in our world, which we call electricity.
Today it's kind of boring to most people (not me, obviously), but imagine how magical it was upon discovery and propagation. A common need in industry is to turn a shaft, which we now do with electric motors. Prior to the motor, the only installable, on-demand source of rotational power was a steam engine. It had lots of moving parts, required a boiler, and needed a fuel source, coal or wood, that had to be carted in by horse. Then you had to hire skilled, trained operators to maintain the engines and the boilers.
With electricity, all you do is connect two or three wires to a hunk of iron and copper, and a shaft supported by two bearings spins. Just make sure the bearings are oiled and the brushes (DC motors were once common) are in proper order, and you're good to go. There is no dangerous combustion, no flue gas, pipes, or burn hazards, no fuel storage, no bulky boiler, and no need for boilermen. In an instant an invisible force is pushing the shaft around, whose only wear items are serviceable bearings. If you needed light to see the machines these motors operated, you just connected two wires to a glass sphere that emitted a bright, illuminating glow. That's magic.
Now, thanks to modern high-power transistors and the smaller, faster transistors in microchips, we can tame this magical force much more accurately and cheaply, enabling electric cars, LED lighting, solar and other renewables, high-efficiency switching power supplies, and so much more.
All this thanks to the magical invisible force of electricity and electromagnetism tamed by our equally magical semiconducting devices. The modern world is literally moved by these devices.
I consider wireless to be black magic which is also enabled by really fast transistors.
I mean transistors are neat and all, but what really sets things apart for me is integrated circuits. When you look at the size of the discrete MOnSter 6502 CPU[1], featured here[2] recently, and realize it has around 3k transistors and the latest CPUs and GPUs have several billion...
And not just transistors, but analog circuits as well, allowing for extremely compact designs.
I think the transistor age would make more sense; it allows the age to cover a broader stretch of history. Think about things like the stone/bronze/iron ages: it's not any one new tool but the base technology itself. So if silicon gets supplanted, I don't think we will stop using transistors.
But who knows with quantum photonics, maybe laser/optic circuits will supplant a large chunk of silicon transistors in the next 100 years
If you're going for a modern version of "Ages of Humanity" akin to Hesiod/Ovid, sure; if not, Digital or Information.
But Stone/Copper/Bronze/Iron refer specifically to the dominant material for crafting weaponry. In that sense, maybe we're in the transition between Steel and Plastic, though maybe a case could be made for Lead.
Or we really are in the Nuclear Age, and it becomes the Final Age of Humanity.
Or war has gone digital, and we are in the Information Age, beyond terrestrial materials. In that case, true Space and then Atomic (universal assembler) Ages may follow.
The IBM 1401 was the most popular computer of the early 1960s, with over 10,000 produced. It was built from germanium (not silicon) transistors. So silicon shouldn't get all the credit.
Transistor, information, or automation could be good, I think. Another option, maybe: what really distinguishes our civilization right now is that we’ve integrated our economies across the entire planet, so “global age” might be a good candidate.
Silicon is present in sand, so on some level we’ve been using it forever in ceramics. Like copper, it has been a good friend to humanity for a very long time. The only problem is that “using lots of Si” is not a distinguishing characteristic for an age!
We call it the Iron Age because iron is a grumpy, unfriendly element that didn’t want to help out until we got some pretty fancy forges.
Maybe we could use it to mark some larger scales. Silicon age could start around 25k years ago with the invention of pottery. Before that, I dunno, fire age or rock age.
No, I think the concept of ages is itself a historiographic fiction that peaked in the 20th century, an attempt to impose order and narrative on the unstructured chaos of history. Ages are one-dimensional, West-centric, and IMHO obsolete.
Historians of the future will probably refer to the dot-com bubble as the start of the Silicon Age. But another take could be that the silicon and transistor age started at the same time with commercial silicon transistors (mid 50s).
Only time will tell, but at least for now it seems like either will work for the next few thousand years. Iron is still important, and bronze is still used, but silicon seems like the backbone of our civilization now.
Whenever I read sci-fi stories featuring self-replicating robots, factories, or machines, or even generation ships, there is a huge jump in technology/magic that is never discussed: what technology replaces semiconductors? A portable semiconductor fab seems so far outside the realm of possibility now. It would be such a revolutionary change that it's beyond speculation, apart from handwaving at bioengineered replacements or easy atomic-level manipulation a la The Culture series.
My thinking is: there are many ways to make matter do compute. Semiconductor-based digital logic chips are only one of many possibilities.
You can make an analog computer from just about anything; it's a matter of identifying things you can interpret as flows and stores (or, as the professors who taught control systems at my uni put it, faucets, drains, and bathtubs) and arranging them just right. You can make a digital computer from just about anything too: it's a matter of identifying things you can interpret as a NAND gate and stacking them just right. You can encode a neural network model as grooves in plastic sheets, stack them into layers, and have it naturally process incoming light. Etc. The possibilities are limitless, because computing is more about your mental model and less about the substrate itself.
So I imagine the breakthrough for nanotech will come when someone figures out a way to make some kind of programmable computer that works at nanoscale, can be easily produced, and is somewhat robust against the environment. It doesn't need to be a fast computer. We're used to multi-GHz CPUs and multi-MHz microcontrollers, but at nanoscale even an equivalent of an old 8008 will do, or something much weaker: a single nanobot doesn't need much compute. You'll be deploying them by the thousands or millions, and you'll want to program them as a swarm anyway.
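The "identify a NAND gate in your substrate and stack them just right" claim can be made concrete in a few lines. Here is a minimal sketch in ordinary Python (standing in for whatever physical medium supplies the NAND): all other gates, and from there arithmetic, derived from NAND alone. The helper names are my own illustration, not from the comment.

```python
# NAND is functionally complete: every Boolean function can be built
# from it. Whatever substrate gives you one NAND gives you a computer.

def nand(a: bool, b: bool) -> bool:
    return not (a and b)

# Standard constructions of the other gates from NAND alone:
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

def xor_(a, b):
    # Classic four-NAND XOR construction.
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

# A half adder, the first step toward arithmetic, from NANDs only:
def half_adder(a, b):
    return xor_(a, b), and_(a, b)  # (sum bit, carry bit)

print(half_adder(True, True))  # (False, True): 1 + 1 = binary 10
```

The same stacking works whether `nand` is implemented with transistors, relays, fluidics, or marbles rolling through slots; only the wrapper changes.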
I assume in those stories that nanotech has displaced photolithography in the fabrication of microelectronics. If you can get nanoassemblers to reliably produce a simple CPU, you can pick up the slack through software and bootstrap from there.
Fair, though even still, those are hidden away in little plastic boxes. The actual meat of a transistor isn't really visible except by analogy, e.g. observing an equivalent behaviour going on inside a glass vacuum tube.
This article has a link to the IEEE Electron Devices Meeting, and flipping through the presentations, optical interconnects come up, leading to the still rather sci-fi notion of the photonic computer:
https://en.wikipedia.org/wiki/Optical_transistor
> "In principle, all-optical digital signal processing and routing is achievable using optical transistors arranged into photonic integrated circuits. The same devices could be used to create new types of optical amplifiers to compensate for signal attenuation along transmission lines."
Indeed. I just completed "Chip War" by Chris Miller, which delves into the history and geopolitics of this wonderful invention. After WWII, Japan, Singapore, Taiwan, and South Korea owe much of their prosperity to the transistor.
I can't help but think of this analogy: transistors are like judo. You keep your feet firmly on the ground, then apply a small amount of energy at a point you control (the base current) to use to your advantage a much bigger energy that comes from the opponent (the collector current through the load).
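To put numbers on the judo analogy, here is a minimal sketch using the standard first-order BJT approximation (collector current equals current gain times base current). The gain and current values below are illustrative assumptions, not taken from the comment.

```python
# The "judo" view of a BJT in numbers: a tiny base current steers a much
# larger collector current drawn from the supply (the opponent's energy).
# First-order active-region approximation, Ic = beta * Ib; ignores
# saturation and base-emitter drop. Values are illustrative only.

beta = 100          # typical small-signal current gain for a BJT
i_base = 50e-6      # 50 microamps of "effort" applied at the base

i_collector = beta * i_base   # 5.0 mA through the load

print(f"base current:      {i_base * 1e6:.0f} uA")
print(f"collector current: {i_collector * 1e3:.1f} mA")
print(f"leverage:          x{i_collector / i_base:.0f}")
```

The leverage factor is exactly the point of the analogy: the small input doesn't supply the output energy, it only steers energy that was already available from the supply.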
I'd be willing to say electromagnetic waves and signals are the true invisible, magic infrastructure. But of course transistors and ICs do most of the magic.
A-Train | 3 years ago:
http://www.telegraph.co.uk/culture/books/authorinterviews/10...
amelius | 3 years ago:
Yeah but the magic happens mostly behind a flat surface that we call a screen.
[1]: https://monster6502.com/
[2]: https://news.ycombinator.com/item?id=33841901
johnohara | 3 years ago:
In section 3, Reducing the Size, he speculates about having a billion transistors in a computer, then immediately qualifies his statement.
That was almost thirty years ago.
infiniteUnivers | 3 years ago:
* glances over at bins of assorted TO-92 and TO-220's *
checkmate, Mr. Goldstein.
carapace | 3 years ago:
The Haber-Bosch Process, without which none of us would be here reading this today.
The transistor. 'natch.
Rare earth magnets (which enable small, strong motors, which enable tiny drones and factories, which revolutionize economics).
Victerius | 3 years ago:
"It is a strange fate we should suffer so much fear and doubt… over so small a thing. Such a little thing."
Where the little thing here is the transistor.