Seems an optimistic timeline to 'reinvent' memory. They have perhaps forgotten about 'bubble memory' which was going to 'eliminate disks' in 1982.
From the article : "As reported by The Register, at a recent conference in Oxnard, California, HP’s Stan Williams said that commercial memristor hardware will be available by the end of 2014 at the earliest."
So basically we'll get to see real devices perhaps at the end of 2014 (I'm guessing closer to 2017, but we'll see). And we need a couple of years of building/using/repairing them before we see widespread design wins, then another year before memristor-enabled devices hit the market and are or are not competitive.
That said, I'm rooting for them to be successful; flash is in a bad way at the moment, with feature size being a hard limit on cell lifetimes.
Bubble memory was doing just fine until a Nobel-winning physics breakthrough allowed the existence of high-density hard drives. If memristors do as well as bubble memory, there will have to be a serious fluke for it not to replace everything on the market. https://en.wikipedia.org/wiki/Bubble_memory#Commercializatio...
http://vixra.org/abs/1205.0004
http://www.slideshare.net/blaisemouttet/mythical-memristor
Some of the controversy is about priority, which may not matter so much; I care less about whom I get massive on-chip non-volatile storage from than that I get it at all. But that too is under dispute (e.g. "Myth #3" in the second link above). So it's a little distressing to see signs of vaporware from HP at this point. I really want this!
Also, the OP links to a post (http://www.theregister.co.uk/2012/07/09/hp_memristor_and_pho...) in which Williams says something odd: "We're not going to make money off these chips. We are going to make money by building cool systems utilising these chips." If his memristor claims are true, the chips themselves could hardly be more of a game changer – worth billions to put it mildly.
> Historically, electrical circuits were crafted with three basic building blocks: the capacitor, the resistor, and the inductor. But in 1971, University of California at Berkeley professor Leon Chua predicted the existence of a fourth: the memristor, short for memory resistor.
> Then, in May of 2008, HP announced that it had actually built a memristor, thanks to HP Labs Fellow R. Stanley Williams and others working in the company’s research arm.
Here's a great presentation from R. Stanley Williams[1], via prior discussion at HN[2].
[1] http://www.youtube.com/watch?v=bKGhvKyjgLY&sns=em
[2] http://news.ycombinator.com/item?id=3088739
What I take from this presentation is much, much more than lots of memory. Apparently, these things can also do processing, like an FPGA. The result would be a whole system on a chip, only the design is almost entirely pushed up to the "software" level.
To me, that's even more game changing than a mere petabyte on a square inch. "Specialized" hardware is made easy. Complete re-purposing becomes possible. Most hardware compatibility issues just go away. And of course, it will be much, much easier to experiment with novel and totally crazy architectures.
Oh, and they say it can re-configure itself on the fly, very rapidly. So it's not just easy innovation on hardware; it's metamorphic hardware. Imagine, for instance, code that compiles down to logic gates instead of to (comparatively very high-level) assembly code, possibly on the fly. Or re-allocating hardware resources by the second, depending on your needs.
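To make the "compiles down to logic gates" idea concrete, here's a toy sketch. It's entirely hypothetical (the function and wire names are mine, and real toolchains are vastly more involved): it lowers a nested boolean expression into a flat gate netlist, the kind of artifact a reconfigurable fabric could in principle instantiate directly.

```python
import itertools

# Toy illustration only: lower a nested boolean expression such as
# ("or", ("and", "a", "b"), ("not", "c")) into a flat netlist of
# single-output gates.

_wire_ids = itertools.count(1)

def compile_to_gates(expr, netlist):
    """Return the output wire of `expr`, appending gates to `netlist`."""
    if isinstance(expr, str):          # a named input wire, e.g. "a"
        return expr
    op, *args = expr                   # ("and"/"or"/"not", operand, ...)
    inputs = [compile_to_gates(a, netlist) for a in args]
    out = f"w{next(_wire_ids)}"        # fresh intermediate wire
    netlist.append((op, inputs, out))  # gate record: (kind, inputs, output)
    return out

netlist = []
top = compile_to_gates(("or", ("and", "a", "b"), ("not", "c")), netlist)
print(netlist)
# [('and', ['a', 'b'], 'w1'), ('not', ['c'], 'w2'), ('or', ['w1', 'w2'], 'w3')]
```

Re-running the "compiler" on a different expression yields a different netlist, which is the hand-wavy analogue of re-allocating the fabric on the fly.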
> “Our partner, Hynix, is a major producer of flash memory, and memristors will cannibalise its existing business by replacing some flash memory with a different technology,” he said. “So the way we time the introduction of memristors turns out to be important. There’s a lot more money being spent on understanding and modeling the market than on any of the research.”
Surely if they wait too long purely for profitable reasons the market will punish them by someone else beating them to it?
As bad as that sounds, I don't think that HP is currently making much money off flash memory, so they still have an incentive to get the memristors to market soon. They just have chosen to make enough concessions to secure the cooperation of a DRAM manufacturer so that HP doesn't have to build the whole memristor business themselves. (The way HP's been managed in recent years, I wonder if they would be able to get and hold on to enough capital to mass produce memristors and bring them to market without help.)
But HP would never attempt to maximize for profit right? The memristor is going to be its gift to humanity in the same way they've kept ink-jet and toner prices right above their costs.
I think that statement about spending more (time or money?) studying the market than doing research is pretty sad.
That passage leapt out at me too. In my simplistic geek mind this is totally arse-about-face, like the tail wagging the dog. Are these companies more interested in money than what they're actually doing?
Aficionados of weird coincidence might be interested to learn that the researcher who first theorized the memristor, Leon Chua, has a daughter who is rather famous for having written "Battle Hymn of the Tiger Mother" (http://en.wikipedia.org/wiki/Amy_Chua). Leon Chua is also an enthusiast of Stephen Wolfram's NKS, and spoke at one of the early NKS conferences.
They should start bringing out developer packages ASAP, because memory is not the only thing you can do with it, as you can see from the presentations and papers. I find the stuff it can do BESIDES memory actually more interesting, and I'd really like to get experimenting with it.
Even if the base circuit elements are made of memristors instead of transistors, the traces on top of them still need to be laid out in the traditional manner. So the fixed cost of bringing even a small memristor device to market is millions.
Memristors are not some magic stuff that suddenly makes FPGAs as efficient as ASICs, nor do they make them dramatically cheaper.
So then memristors remain "in the future". Sigh. I was naive enough to believe them a year ago when they said "it'll be on the market in 18 months". I was really looking forward to that being true and ditching RAM in six months.
It may be 18 months in the RAM world, but in memristor world it is as long as you like and still holding the 18-month value :).
That all said, you are right: sadly, come 2014 it will still not be available to replace the RAM in your PC, as it will take that long again to define a standard and then that time again to have chipsets that support said standard.
So for a consumer PC, I'd say 2017 is when you can look at replacing your RAM, and by that time PCs will probably be reduced to devices where you can't change the RAM, and as such you won't know what you have got inside. But who knows for sure, and if they do then they would probably get done for insider share dealing just by telling people.
The only thing we know for sure is the here and now, and that any date that has no product release set in stone is a date that has not had the engineering factor of x2 applied, but has had the marketing factor of x0.5. I.e. engineers double the amount of time it will really take, and marketing will halve it. Even then, engineers these days really need to use x3.
Still, once we see a prototype working in a form factor we can identify with, running Windows or a commercial OS of any form, then and only then can we feel that it will be available within 2 years. Anything else is pure marketing by people who will be working elsewhere in a few years' time.
Though I would love to be proved wrong on this, truly.
I'm more surprised at how surprised so many people seem to be at this. This isn't a college student building a music-sharing app here. This is a huge corporation preparing to deliver disruptive technology. Naturally they're doing enormous amounts of market research.
Memristors have "been around" in laboratory settings for a few years now. The research is done; now it's time to figure out how to market it. Seems pretty reasonable to me. This is actually very exciting to me, because the concept of a memristor (a large number of small devices each storing a chunk of a larger piece of memory) is supposedly the same as, or similar to, the way your brain's memory works.
I think memristors are a real contender for future devices; the DARPA SyNAPSE research has shown that they're already a big backbone for future AI devices.
I wonder if the first uses of RRAM, when it's still fairly expensive and low density, will be as a coalescing cache in front of big blocks of flash memory? Or maybe just for the page table.
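A sketch of what that coalescing cache might look like (purely illustrative; the class, constants, and flush policy are all invented here): small random writes land in fast byte-addressable RRAM, and a flash erase block is programmed only once enough writes to it have accumulated, amortizing program/erase cycles.

```python
# Purely illustrative sketch of a coalescing write cache: small writes
# are buffered in fast byte-addressable RRAM, and a flash erase block is
# programmed only when enough writes to it have accumulated, so many
# small writes cost one program/erase cycle instead of one each.

BLOCK_SIZE = 4096       # assumed flash erase-block size in bytes
FLUSH_THRESHOLD = 8     # assumed policy: flush after 8 buffered writes

class CoalescingCache:
    def __init__(self):
        self.program_cycles = {}  # block number -> program/erase cycles spent
        self.pending = {}         # block number -> buffered (offset, data) writes

    def write(self, addr, data):
        block = addr // BLOCK_SIZE
        self.pending.setdefault(block, []).append((addr % BLOCK_SIZE, data))
        if len(self.pending[block]) >= FLUSH_THRESHOLD:
            self.flush(block)

    def flush(self, block):
        # one flash program cycle absorbs every buffered write to the block
        self.pending.pop(block, None)
        self.program_cycles[block] = self.program_cycles.get(block, 0) + 1

cache = CoalescingCache()
for i in range(16):               # 16 small writes, all inside block 0
    cache.write(i * 8, b"x")
print(cache.program_cycles)       # {0: 2} -- two cycles instead of sixteen
```

The same buffer also shortens the write path: reads of recently written data could be served from the RRAM side before anything touches flash at all.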
“Development costs at least 10 times as much as research, and commercialization costs 10 times as much as development. So in the end, research — which we think is the most important part — is only 1 percent of the effort.”
This just means that it's easy to get something working in a lab but hard to make a product that you can mass-produce. In non-volatile memory, a flash replacement is always 2 years out, just like cold fusion is always 10 years out.
That was my take, too. I've been hearing about commercial memristors for at least a couple decades now, and they're forever on the cusp of showing up on my desktop.
But to answer your point: Patents.