I found this remark from the linked pdf[1] very interesting:
"...it has also been shown that for the class of problems used in initial benchmarks, the performance of the devices also correlates well with a classical (mean-field) approximation of quantum annealing [4], which raises questions about whether the devices can outperform classical optimizers."
In other words, if you ran a simulation of this device on a normal computer and gave it the same problem to solve, you would get basically the same efficiency.
Not only that, but the classical simulation of the machine is faster than the real thing, because the simulation models an ideal, error-free machine, which the physical device is not.
The question is: can they still say that when D-Wave reaches, say, 4,000 qubits? What I'm asking is whether classical optimizers on classical computers will always be able to stay "roughly as fast" as D-Wave for this work, or whether D-Wave will inevitably surpass them by orders of magnitude in future iterations.
Scott Aaronson is fucking awesome. Whenever there's a new paper claiming breakthroughs about quantum computing or P=NP, I go to his blog and wait for him to clearly explain it.
He is the very rare case of a top notch scientist with a gift for explaining things well.
It isn't spelled out that clearly, but Scott Aaronson did make an admission in this article that I do not believe he has made before: he admitted the D-Wave computer exhibited quantum effects:
"As far as I’m concerned, this completely nails down the case for computationally-relevant collective quantum tunneling in the D-Wave machine"
I believe many of his previous blog posts had argued strongly that there was no proof of any quantum effects in D-Wave computers, but I could be misremembering.
On the other hand, it's still not clear that this results in computationally relevant speedups. In my mind, this is not much of a shift in the status quo.
Scott Aaronson's entire professional life is essentially devoted to distinguishing what could and couldn't be a quantum effect in computation.
So yes, your memory is accurate: his previous blog posts argued strongly that nothing D-Wave's computers had done proved any quantum effects. His acknowledgement that they've achieved quantum computation is a compliment to their engineering prowess. But he is complimenting new engineering results.
It's not an "admission" - there wasn't proof before (Aaronson never claimed that the effects weren't computationally relevant, only that they hadn't been proven to be) and now there is.
Can someone explain this to those of us who have no clue what these calculations are good for? If this machine is faster at doing something, what would you use that something for?
That's the point; is the machine faster? If the machine is actually doing quantum computing, it won't slow down as much for larger problems (for some problems). But they haven't been able to prove that it's doing quantum magic.
At the moment, it's only faster for the useless task of simulating itself. The hope is that that can be extended to more practical questions, but for now it's a pure research project.
D-Wave's computers work as intended and exhibit "computationally relevant quantum effects" but we still don't know whether the strategy they are using will ever yield practical advantages. Other groups (Google, University of Maryland) seem likely to achieve genuine quantum speedups over best-known classical algorithms within the next few years.
While Scott Aaronson is highly respected, I think we are rushing to put down the D-Wave guys. They seem very conservative in their claims. We should celebrate that we have come this far: we are actually constructing machines that were just research fantasies a few decades ago, and we are at a point where we can even think about comparing them with the best classical machines we have built. I don't think anyone in their right mind thinks D-Wave or other quantum computers will replace classical machines in the next 5-10 years. It's the progress that we need to be positive about, building on each other's work.
D-Wave's reputation is one of exaggeration and putting marketing before science. That's how Scott ended up as de-facto "chief D-Wave skeptic": he kept complaining about their exaggerations and the exaggerations of journalists covering them.
(The paper being discussed now does actually seem conservative in its claims, but it's by a team at Google; not D-Wave.)
I'm wondering if this will allow for a huge speed-up in 3D rendering? Specifically, unbiased Monte Carlo ray-tracing / path-tracing renderers like Maxwell Render. Does anyone know of any work being done in this area of quantum computing?
"Does anyone know of any work being done in this area of Quantum computing?"
Work in this area is still primarily focused on making it work at all. For instance, it isn't called out in the linked blog, since by now Scott probably considers it basic background information, but D-Wave only solves one very particular problem, and it is not entirely clear either that it solves that problem better than a classical algorithm can, or that encoding real problems into that form won't end up costing you all of the gains. Really pragmatic applications are still a ways into the future. It's hard to imagine what they might be when we're still so early in the process and still have no good idea what either the practical or theoretical limits are.
Quantum annealing could "potentially" provide a speedup for any optimization problem that can be reduced to an Ising spin form, particularly quadratic unconstrained binary optimization (QUBO).
Notice the emphasis on "potentially", though. This paper only shows that 1) for a particular class of problems the quantum annealer has a constant-factor speedup over one current classical algorithm, and 2) the quantum annealer scales better for number partitioning than a few current classical algorithms.
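To make the reduction concrete, here's a toy sketch in plain Python (my own illustration, not D-Wave's toolchain): number partitioning written as the Ising objective an annealer would minimize, solved here by brute force since the instance is tiny.

```python
# Toy illustration (not D-Wave code): number partitioning cast in the
# Ising form a quantum annealer accepts. Minimizing (sum_i s_i * a_i)^2
# over spins s_i in {-1, +1} splits the numbers into two subsets whose
# sums differ as little as possible.
from itertools import product

def partition_cost(spins, numbers):
    """Squared difference between the two subsets defined by the spins."""
    return sum(s * a for s, a in zip(spins, numbers)) ** 2

def best_partition(numbers):
    """Exhaustive search over all 2^n spin assignments (tiny n only)."""
    return min(product((-1, 1), repeat=len(numbers)),
               key=lambda spins: partition_cost(spins, numbers))

numbers = [4, 5, 6, 7, 8]
spins = best_partition(numbers)
left = [a for s, a in zip(spins, numbers) if s == 1]
right = [a for s, a in zip(spins, numbers) if s == -1]
print(left, right)  # two subsets with equal sums (cost 0) for this instance
```

An annealer attacks the same objective by physical relaxation instead of enumeration; the hard part, as the comment above notes, is whether that ever wins asymptotically.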
The popular perception of quantum computers as "doing things in parallel" is very misleading. A quantum computer lets you perform computation on a superposed state while maintaining that superposition. But that only helps if the structure of the problem lets you somehow "cancel out" the incorrect results leaving you with the single correct one. You can't actually run a bunch of calculations in parallel and measure all of the results; it's more like you can, in some limited cases, make it as though you'd "magically" made the right choice to start with.
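That "cancel out" point can be seen in a few lines of plain NumPy (a textbook toy, nothing to do with D-Wave's hardware): a Hadamard gate puts a qubit into an equal superposition, and a second Hadamard makes the two paths leading to |1> interfere destructively, leaving only |0>.

```python
# State-vector demo of destructive interference: applying the Hadamard
# gate twice returns |0>, because the amplitudes of the paths reaching
# |1> have opposite signs and cancel.
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)  # Hadamard gate

ket0 = np.array([1.0, 0.0])   # start in |0>
superposed = H @ ket0         # amplitudes [0.707, 0.707]: both outcomes equally likely
interfered = H @ superposed   # paths to |1> cancel; state is back to [1, 0]

print(np.round(superposed, 3))  # [0.707 0.707]
print(np.round(interfered, 3))  # [1. 0.]
```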
The phrasing of "factor-10^8 speedup" threw me for a bit. Is that a negative speedup, meaning it's significantly slower? No, it's actually "10^8 faster".
What's a thing explainer? Unfortunately, since quantum computing is still really only an academic pursuit, you're only going to find either technical literature or material dumbed down to the point of not being worth it.
That said, there is one book I like called Quantum Processes, Systems, and Information by Schumacher (who coined the term qubit) and Westmoreland, since it is an introductory text and has a good discussion of the basics of quantum computing. It does, however, assume you are a physics student with a good understanding of linear algebra.
If you want to actually understand it (complete with exercises), go for Nielsen and Chuang (maybe a second-hand copy; the editions don't change that much and it's pretty expensive new). It's not ELI5 though.
"Finally, on the special Chimera instances with the tall, thin energy barriers, the authors find that the D-Wave 2X reaches the global optimum about 10^8 times faster than Quantum Monte Carlo running on a single-core classical computer. But, extremely interestingly, they also find that this speedup does not grow with problem size; instead it simply saturates at ~10^8. In other words, this is a constant-factor speedup rather than an asymptotic one."
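The distinction is easy to see with made-up numbers (only the 10^8 figure comes from the post; the runtime formulas below are toy models): a constant-factor speedup keeps the same ratio at every problem size, while an asymptotic one (here a hypothetical square-root speedup) keeps pulling ahead as problems grow.

```python
# Toy comparison of a constant-factor speedup vs an asymptotic one.
# Assume a classical runtime of 2^n (illustrative only).
for n in (40, 80, 120):
    t_classical = 2.0 ** n                              # toy exponential classical runtime
    ratio_constant = t_classical / (t_classical / 1e8)  # stays ~1e8 at every n
    ratio_asymptotic = t_classical / (2.0 ** (n / 2))   # 2^(n/2): grows with n
    print(n, f"{ratio_constant:.3g}", f"{ratio_asymptotic:.3g}")
```

For large enough n the asymptotic speedup dwarfs any fixed constant, which is why the saturation at ~10^8 matters.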
[1] http://www.scottaaronson.com/troyer.pdf
But that's the evolution of technology. The first cars were not a significant advantage over horses either. Sometimes a new technology is much more efficient from the start (like transistors), but that's rare.