
Google, D-Wave, and the case of the factor-10^8 speedup

312 points | apsec112 | 10 years ago | scottaaronson.com

110 comments

[+] calhoun137|10 years ago|reply
I found this remark from the linked pdf[1] very interesting:

"...it has also been shown that for the class of problems used in initial benchmarks, the performance of the devices also correlates well with a classical (mean-field) approximation of quantum annealing [4], which raises questions about whether the devices can outperform classical optimizers."

In other words, if you ran a simulation of this device on a normal computer and gave it the same problem to solve, you would get basically the same efficiency.
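For the curious, the "classical (mean-field) approximation" referred to above is a spin-vector Monte Carlo model, in which each qubit is replaced by a classical angle. Here is a rough sketch (my own, not the paper's code, with made-up couplings):

```python
import math
import random

# Rough sketch (mine, not the paper's code) of spin-vector Monte Carlo,
# a classical mean-field model of quantum annealing. Each qubit i becomes
# a classical angle theta_i; the couplings J and fields h are made-up values.
def energy(thetas, J, h, gamma):
    # Transverse-field term (strength gamma) plus the classical Ising terms.
    e = -gamma * sum(math.sin(t) for t in thetas)
    e -= sum(h[i] * math.cos(t) for i, t in enumerate(thetas))
    e -= sum(J[i][j] * math.cos(thetas[i]) * math.cos(thetas[j])
             for i in range(len(thetas)) for j in range(i + 1, len(thetas)))
    return e

def metropolis_sweep(thetas, J, h, gamma, beta):
    # One pass over all spins: propose a fresh angle, accept by the Metropolis rule.
    for i in range(len(thetas)):
        old = thetas[i]
        e_old = energy(thetas, J, h, gamma)
        thetas[i] = random.uniform(0.0, math.pi)
        d = energy(thetas, J, h, gamma) - e_old
        if d > 0 and random.random() >= math.exp(-beta * d):
            thetas[i] = old  # reject the uphill move

# Two ferromagnetically coupled spins: annealing gamma down (and beta up)
# should leave both spins pointing the same way.
random.seed(0)
J = [[0.0, 2.0], [0.0, 0.0]]
h = [0.0, 0.0]
thetas = [random.uniform(0.0, math.pi) for _ in range(2)]
for step in range(200):
    frac = step / 199
    metropolis_sweep(thetas, J, h, gamma=2.0 * (1 - frac), beta=0.5 + 9.5 * frac)
spins = [1 if math.cos(t) > 0 else -1 for t in thetas]
print(spins[0] == spins[1])
```

The point of the comparison in the paper is that this kind of model, run on an ordinary CPU, tracks the D-Wave hardware's success probabilities closely on the benchmark instances.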

[1] http://www.scottaaronson.com/troyer.pdf

[+] BrainInAJar|10 years ago|reply
Not only that: the classical simulation of the machine is faster than the real thing, because the simulation models an ideal, error-free machine, which the real device is not.
[+] mtgx|10 years ago|reply
The question is, can they still say that when D-Wave reaches, say, 4,000 qubits? What I'm asking is: will classical optimizers and classical computers always be able to stay "roughly as fast" as D-Wave for that work, or will D-Wave inevitably surpass them by orders of magnitude in future iterations?
[+] tormeh|10 years ago|reply
So there are huge algorithmic speedups just sitting there for the taking? And no one has used these? Color me skeptical.
[+] SilasX|10 years ago|reply
So ... more proof that D-Wave is fluff.
[+] Fede_V|10 years ago|reply
Scott Aaronson is fucking awesome. Whenever there's a new paper claiming breakthroughs about quantum computing or P=NP, I go to his blog and wait for him to clearly explain it.

He is the very rare case of a top notch scientist with a gift for explaining things well.

[+] bhouston|10 years ago|reply
It isn't that clear, but Scott Aaronson did make an admission in this article that I do not believe he has made before: he admitted the D-Wave computer exhibited quantum effects:

"As far as I’m concerned, this completely nails down the case for computationally-relevant collective quantum tunneling in the D-Wave machine"

I believe many of his previous blog posts had argued strongly that there was no proof of any quantum effects in D-Wave computers, but I could be misremembering.

[+] dvanduzer|10 years ago|reply
Scott Aaronson's entire professional life is essentially to distinguish what could and couldn't be a quantum effect in computation.

So yes, your memory is accurate: his previous blog posts argued strongly that nothing D-Wave computers had done proved any quantum effects. His acknowledgement that they've achieved quantum computation is a compliment to their engineering prowess, but it is a compliment to new engineering results.

[+] lmm|10 years ago|reply
It's not an "admission" - there wasn't proof before (Aaronson never claimed that the effects weren't computationally relevant, only that they hadn't been proven to be) and now there is.
[+] coldcode|10 years ago|reply
Can someone explain this to those of us who have no clue what these calculations are good for? If this machine is faster at doing something, what would you use that something for?
[+] mcguire|10 years ago|reply
"If this machine is faster to do something..."

That's the point; is the machine faster? If the machine is actually doing quantum computing, it won't slow down as much for larger problems (for some problems). But they haven't been able to prove that it's doing quantum magic.

[+] lmm|10 years ago|reply
At the moment, it's only faster for the useless task of simulating itself. The hope is that that can be extended to more practical questions, but for now it's a pure research project.
[+] adrianbg|10 years ago|reply
D-Wave's computers work as intended and exhibit "computationally relevant quantum effects" but we still don't know whether the strategy they are using will ever yield practical advantages. Other groups (Google, University of Maryland) seem likely to achieve genuine quantum speedups over best-known classical algorithms within the next few years.
[+] raverbashing|10 years ago|reply
Yes

But that's the evolution of technology. First cars were not a significant advantage over horses

Sometimes it is much more efficient from the start (like transistors) but this is rare.

[+] sytelus|10 years ago|reply
While Scott Aaronson is highly respected, I think we are rushing to put down the D-Wave guys. They seem very conservative in their claims. We should celebrate that we have come even this far: we are actually constructing machines that were just research fantasies a few decades ago, and we are at a point where we can even think about comparing them with the best we have built. I don't think anyone in their right mind thinks D-Wave or other QCs will replace classical machines in the next 5-10 years. It's the progress that we need to be positive about, building on each other's work.
[+] Strilanc|10 years ago|reply
> They seem very conservative in their claims.

Hahaha!

D-Wave's reputation is one of exaggeration and putting marketing before science. That's how Scott ended up as de-facto "chief D-Wave skeptic": he kept complaining about their exaggerations and the exaggerations of journalists covering them.

(The paper being discussed now does actually seem conservative in its claims, but it's by a team at Google; not D-Wave.)

[+] touchofevil|10 years ago|reply
I'm wondering if this will allow for a huge speedup in 3D rendering? Specifically, unbiased Monte Carlo ray-tracing / path-tracing renderers like Maxwell Render. Does anyone know of any work being done in this area of quantum computing?
[+] jerf|10 years ago|reply
"Does anyone know of any work being done in this area of Quantum computing?"

Work in this area is still primarily focused on making it work at all. For instance, it isn't called out in the linked blog post, since by now Scott probably considers it basic background information, but D-Wave only solves one very particular problem. It is not entirely clear that it solves that problem better than a classical algorithm can, and it is not clear that encoding real problems into that form won't cost you all of the gains by itself. Really pragmatic applications are still a ways into the future. It's hard to imagine what they might be when we're still so early in the process and still have no good idea what either the practical or theoretical limits are.

[+] efangs|10 years ago|reply
Quantum annealing could "potentially" provide a speedup for any optimization problem that can be reduced to a spin Ising form, particularly quadratic unconstrained binary optimization (QUBO).

Notice the emphasis on "potentially", though. This paper only shows that 1) for a particular class of problems the quantum annealer has a constant-factor speedup over one current classical algorithm, and 2) the quantum annealer scales better for number partitioning than a few current classical algorithms.
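To make the Ising/QUBO framing concrete, here is a toy sketch (mine, not from the paper) of number partitioning in Ising form. Each spin s_i in {-1, +1} assigns a number to one of the two groups, and the energy is zero exactly for a balanced split:

```python
from itertools import product

# Toy sketch (not from the paper): number partitioning in Ising form.
# Spin s_i in {-1, +1} assigns numbers[i] to one of two groups; the energy
# (sum_i s_i * numbers[i])**2 is zero exactly when the groups balance.
numbers = [4, 5, 6, 7, 8]

def ising_energy(spins, values):
    return sum(s * v for s, v in zip(spins, values)) ** 2

# Brute force over the 2^n spin configurations (only feasible for tiny n;
# an annealer, quantum or classical, searches this landscape heuristically).
best = min(product([-1, 1], repeat=len(numbers)),
           key=lambda s: ising_energy(s, numbers))
print(best, ising_energy(best, numbers))
```

The point of an annealer is to find such a minimum-energy spin configuration without the exponential enumeration.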

[+] lmm|10 years ago|reply
The popular perception of quantum computers as "doing things in parallel" is very misleading. A quantum computer lets you perform computation on a superposed state while maintaining that superposition. But that only helps if the structure of the problem lets you somehow "cancel out" the incorrect results leaving you with the single correct one. You can't actually run a bunch of calculations in parallel and measure all of the results; it's more like you can, in some limited cases, make it as though you'd "magically" made the right choice to start with.
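A tiny sketch of that "cancelling out" (my own toy example, not from the thread): applying a Hadamard transform twice to a one-qubit state makes the two paths into |1> interfere destructively, returning the state to |0> with certainty.

```python
import math

# One qubit = a pair of complex amplitudes (a0, a1). Toy example of
# destructive interference: H applied twice takes |0> to an equal
# superposition and then back to |0>.
def hadamard(state):
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

state = (1.0, 0.0)        # start in |0>
state = hadamard(state)   # equal superposition, 50/50 on measurement
state = hadamard(state)   # the two paths into |1> cancel out
print(state)
```

Quantum algorithms like Shor's or Grover's are, at heart, clever ways of arranging this interference so that wrong answers cancel and right answers reinforce.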
[+] biot|10 years ago|reply
The phrasing of "factor-10^8 speedup" threw me for a bit. Is that a negative speedup, meaning it's significantly slower? No, it's actually "10^8 faster".
[+] HappyTypist|10 years ago|reply
They probably mean the multiplier, as in "factor-100 speedup" meaning 100 times faster.
[+] lovelearning|10 years ago|reply
Any book that gives an overview of quantum computing to a programmer? A sort of "thing explainer" for quantum computing?
[+] cabinpark|10 years ago|reply
What's a thing explainer? Unfortunately, since quantum computing is still really only an academic pursuit, you're only going to find technical literature, or material dumbed down to the point of not being worth it.

That said, there is one book I like, Quantum Processes, Systems, and Information by Schumacher (who coined the term "qubit") and Westmoreland, since it is an introductory text with a good discussion of the basics of quantum computing. It does, however, assume you are a physics student with a good understanding of linear algebra.

[+] lmm|10 years ago|reply
If you want to actually understand it (complete with exercises), Nielsen and Chuang (maybe go for a second-hand copy, the editions don't change that much and it's pretty expensive new). It's not ELI5 though.
[+] arthurcolle|10 years ago|reply
TL;DR

"Finally, on the special Chimera instances with the tall, thin energy barriers, the authors find that the D-Wave 2X reaches the global optimum about 10^8 times faster than Quantum Monte Carlo running on a single-core classical computer. But, extremely interestingly, they also find that this speedup does not grow with problem size; instead it simply saturates at ~10^8. In other words, this is a constant-factor speedup rather than an asymptotic one."
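To see what "constant-factor rather than asymptotic" means in practice, here is a back-of-the-envelope calculation (illustrative cost model of my own, not from the paper):

```python
import math

# Illustrative only: with an exponential cost model, a fixed 10^8 speedup
# buys a fixed amount of extra problem size at every n, not a better trend.
SPEEDUP = 1e8  # constant factor, independent of problem size

def classical_time(n):
    return 2.0 ** n

def annealer_time(n):
    return 2.0 ** n / SPEEDUP

# Extra variables solvable in the same wall-clock budget: log2(10^8) ~ 26.6,
# and it is the same gain at n = 100 as at n = 1000.
extra_bits = math.log2(SPEEDUP)
print(round(extra_bits, 1))
```

That is why a constant-factor win, however large, does not by itself demonstrate quantum advantage: a better classical algorithm or a faster classical chip can absorb it.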