
Possible unconventional computing techniques of the future

53 points | diego898 | 11 years ago | nautil.us

37 comments

[+] ChuckMcM|11 years ago|reply
I was expecting a bit more from nautil.us. There are a lot of alternate computation mechanisms but they don't lend themselves to the environments where we put computers (you're not going to run a satellite on slime mold). Building better biologic diagnostics? Sure. But in the computer space where Moore's Law is often cited, graphene or other carbon structures, or even silicene seem more likely and 3D structures even more so.
[+] aidenn0|11 years ago|reply
A little rewording, and we get:

> Transistor computing is naturally parallel, aidenn0 says, with computations taking place simultaneously at every logic gate

[+] farresito|11 years ago|reply
Boy, I should stop reading HN. The more I read about the scientific advancements that are going on in this field, the more I think about getting a second degree in, say, biology, to try to work on biological computers or stuff like that.
[+] logicallee|11 years ago|reply
in my opinion moore's law is already fine for another 15 years.

(15 years / 18 months = 10, so 10 iterations). moore's law is about transistor count.

the reason it's fine to have up to 1024x as many transistors (2^10, since moore's law is about doubling):

>"Moore's law" is the observation that, over the history of computing hardware, the number of transistors in a dense integrated circuit doubles approximately every two years.

oh, I see it says 2 years, not even 18 months as I'd thought.

anyway the reason it will continue just fine is that we are using one tiny few-millimeter slice (plane) to print transistors onto whereas obviously by 2035 there will be multiple layers (3d). this is totally obvious since

https://www.google.com/search?q=c+%2F+4+ghz

and we're already at 7.49481145 centimeters travelled in a clock cycle. etchings are at 14 nm already. a carbon nanotube is around 4 nm wide. carbon itself (the atom) only has an atomic radius of 0.14 nm, and you're not going to be inserting your etchings into the quarks and protons of atoms and hoping to do your calculation there.
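the figures above are easy to sanity-check; a quick script (the 4 GHz clock and the 18-month doubling period are just the assumptions from this comment, not established facts):

```python
# Sanity-check the numbers above: light travel per clock cycle,
# and transistor doublings over 15 years at one doubling per 18 months.
c = 299_792_458            # speed of light, m/s
clock_hz = 4e9             # assumed 4 GHz clock

cm_per_cycle = c / clock_hz * 100
print(f"light travels {cm_per_cycle:.8f} cm per cycle")   # 7.49481145

doublings = (15 * 12) // 18
print(f"{doublings} doublings -> {2 ** doublings}x the transistors")  # 10 -> 1024x
```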

it's completely obvious that before very long we'll take those 8 centimeters light travels per cycle @ 4 ghz and do something other than zigzag across a plane, a single slice, when we can stack thousands of them.

there's nothing wrong with moore's law except the fact that people are too good at shrinking die size, so it'll be a while for them to think inside the box. (outside the plane.)

[+] nhaehnle|11 years ago|reply
You're right that going 3D is a natural thing, but there appear to be pretty serious manufacturing challenges. The most obvious one: chips already take a pretty long time to manufacture - apparently the "latency" of a typical fab is on the order of weeks. If you double the number of layers, you double this latency.

So the only way to get serious about 3D is to produce thin slices in parallel and then put them together somehow. As a corollary, this means we're most likely never going to see a logical unit such as an entire core distributed across two layers of transistors.

This doesn't invalidate your point, of course. We're likely to see designs where multiple dies, each with several cores, are stacked on top of each other.

[+] jhallenworld|11 years ago|reply
"A ternary lookahead algorithm does not yet exist, and other algorithms that would be needed for ternary to be practical are also missing."

Really? I'm pretty sure carry-select lookahead would work for ternary.
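For what it's worth, the carry-select scheme does seem to carry over: in (unbalanced) base 3 the carry out of a digit position is still only 0 or 1, so each block can precompute results for both possible carry-ins and let the real carry just pick one. A hypothetical Python sketch, digit lists LSB first — an illustration of the idea, not a hardware design:

```python
def ripple_add(a, b, cin):
    """Add equal-length base-3 digit lists (least significant digit first)."""
    out, carry = [], cin
    for x, y in zip(a, b):
        s = x + y + carry
        out.append(s % 3)
        carry = s // 3          # max digit sum is 2 + 2 + 1 = 5, so carry is 0 or 1
    return out, carry

def carry_select_add(a, b, block=2):
    """Carry-select: each block computes both carry-in candidates up front;
    the incoming carry then merely selects one of them."""
    out, carry = [], 0
    for i in range(0, len(a), block):
        s0 = ripple_add(a[i:i+block], b[i:i+block], 0)  # assume carry-in 0
        s1 = ripple_add(a[i:i+block], b[i:i+block], 1)  # assume carry-in 1
        digits, carry = s1 if carry else s0
        out.extend(digits)
    return out, carry

# demo: 200 + 112 using 6 ternary digits
width = 6
to_t = lambda n: [(n // 3**i) % 3 for i in range(width)]
from_t = lambda d: sum(v * 3**i for i, v in enumerate(d))
s, c = carry_select_add(to_t(200), to_t(112))
print(from_t(s) + c * 3**width)   # 312
```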

[+] jerf|11 years ago|reply
What a strange hook. "Moore's law is running out and we soon won't be able to squeeze more performance out of transistors, so... here's a bunch of computing technologies that may have a niche but stand no chance of having better performance than transistors even when optimized." It's a fine article, but the hook doesn't match it.
[+] tacotime|11 years ago|reply
I agree with you 95%. The 5% being an admittedly far-fetched and idyllic notion that just maybe one day we will design the mythical x86 bacterium, and instead of needing meticulously crafted silicon wafers and logic gates, all of a sudden all we need is cheap, simple hydrocarbon to grow the world's most elaborate supercomputers. They wouldn't be faster, but more efficient and highly parallel. Like a more distributed version of a brain where bacterial (or whatever) cells take the place of traditional neurons.
[+] dang|11 years ago|reply
Yes, that is a disappointingly baity title. We changed it to one that attempts to describe the article accurately, but if anyone can suggest a better one, we'll change it again.
[+] mwcampbell|11 years ago|reply
I think I would prefer it if progress in hardware efficiency just stopped when Moore's Law finally becomes invalid. Then hardware wouldn't become obsolete so quickly. That would surely be good for the environment, and for the poor.
[+] sergiosgc|11 years ago|reply
I'm having trouble counting the ways this comment is invalid. It seems to be invalid at each word:

1) Hardware does not become obsolete so quickly: It is good that hardware becomes obsolete. I can't for the life of me imagine that the world would be a better place if state-of-the-art CPUs were the 8 MHz Z80 my computing life started with.

2) Slower CPUs would be good for the environment: How? By needing more iron for a given task? More energy? Is it by disallowing hugely detailed climate simulations? How???

3) Slow CPUs would be good for the less economically privileged: The cheapest CPU that can be bought today is some orders of magnitude faster than an 8 MHz Z80. How would the less privileged be better off if CPU evolution had stopped in the early 80s?

[+] corysama|11 years ago|reply
I never understood people who complain about computers rapidly and continuously getting better and cheaper. That's an incredibly short-sighted point of view. Your old computer would still run Windows 3.1 the same as it always did if you hadn't traded it up for something better. The creation of better machines didn't make your old machine any worse except through your raised aspirations. The poor have benefited immensely from Moore's Law and related effects. What used to require a corporate datacenter can now be had in a cheapo android tablet. The environment has benefited similarly through greatly improved computer-aided design and logistical efficiency. Wishing for that process to stop is a highly self-destructive line of thought.

If the problems of e-waste and computational access for the poor are issues you care about then go work for a computer recycle/reuse center. While you're there, cheer on Moore's Law for bringing better and better tech to the center's inbox by making it so cheap that people are happy to give it away for free!

[+] comboy|11 years ago|reply
And once again we would learn how to write efficient software, yay.

I can't imagine this happening though (before another paradigm shift).

[+] coldtea|11 years ago|reply
Yeah, and software authors would have to get better and more efficient too...
[+] tobylane|11 years ago|reply
What about the sizeable parts of capitalism that aren't environmentally focused or that are economically comfortable?

What about the even poorer people who only receive new hardware when richer people buy upgrades?

[+] JoeAltmaier|11 years ago|reply
Physical hardware has a lifetime of only a few years (motherboard components, heat and moisture). But yeah, if the designs could last longer than a prime-time sitcom, that would be nice.
[+] Yakimoto|11 years ago|reply
>less economically privileged.

wow