top | item 26141498

Microsoft’s big win in quantum computing was an ‘error’ after all

161 points | kumarharsh | 5 years ago | wired.com

48 comments

[+] haltingproblem|5 years ago|reply
Locklin explains it best in "Quantum computing as a field is obvious bullshit" [1]:

"'Quantum computing' enthusiasts expect you to overlook the fact that they haven’t a clue as to how to build and manipulate quantum coherent forms of matter necessary to achieve quantum computation. A quantum computer capable of truly factoring the number 21 is missing in action. In fact, the factoring of the number 15 into 3 and 5 is a bit of a parlour trick, as they design the experiment while knowing the answer, thus leaving out the gates required if we didn’t know how to factor 15. The actual number of gates needed to factor a n-bit number is 72 * n^3; so for 15, it’s 4 bits, 4608 gates; not happening any time soon."

[1] https://scottlocklin.wordpress.com/2019/01/15/quantum-comput...
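A quick sanity check of the gate-count arithmetic in the quoted passage. The 72 * n^3 figure is Locklin's own estimate, not an established bound; the sketch below just applies it:

```python
# Sanity-check of the gate-count arithmetic quoted above, using the
# 72 * n^3 figure from Locklin's post (his estimate, not an established bound).

def shor_gate_estimate(number: int) -> tuple[int, int]:
    """Return (bit length, estimated gate count) for factoring `number`."""
    n_bits = number.bit_length()
    return n_bits, 72 * n_bits ** 3

bits, gates = shor_gate_estimate(15)
print(bits, gates)  # 15 is a 4-bit number -> 72 * 4**3 = 4608 gates

# The same formula applied to a 2048-bit RSA modulus gives ~6.2e11 gates:
print(72 * 2048 ** 3)
```

The numbers in the quote check out: 15 needs 4 bits, and 72 * 4^3 is indeed 4608.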

[+] nolok|5 years ago|reply
I don't entirely disagree, but isn't all revolutionary research like this? You circle around an objective, throwing random things at it with no idea how to achieve it, until someone has a eureka moment and you suddenly clear it. The bigger the objective, the bigger the wall to clear, the bigger the eureka needed.

If we only funded or took seriously research where the researcher actually knew the solution or had a good idea how to reach it, we wouldn't have made that kind of a tech jump in the last century, let alone in the last 50 years.

[+] junippor|5 years ago|reply
> In 2010, I laid out an argument against quantum computing as a field based on the fact that no observable progress has taken place.

I'm a physicist by training and a quant by profession - from what I can gather, not unlike the author.

Without judging the merits of quantum computing I will just notice that arguments like "has no merit because it has made no progress" are backwards-looking. You're not assessing anything from first principles, you're just saying that nothing has happened _yet_.

In finance this reminds me of trend followers. Yes, you can make money by following the trend, but you're the last crowd to jump on the opportunity, right behind the guys who were looking ahead instead of back.

That said, such people also provide a nice balance against "we don't know everything therefore anything is possible" crackpots.

[+] mathgenius|5 years ago|reply
> they haven’t a clue

Excuse me? They have plenty of "clues". Have a look at this for some recent work from the Google group:

https://scirate.com/arxiv/2102.06132

Whether it will lead to quantum computers remains to be seen, but "haven't a clue" is just nonsense.

[+] Ar-Curunir|5 years ago|reply
That argument is quite bullshit, no? Just because something takes a long time doesn’t mean it won’t happen. We invented the first algorithms thousands of years ago, but only constructed working computers to execute those algorithms 100 years ago.
[+] mathgenius|5 years ago|reply
[+] ArtWomb|5 years ago|reply
I feel like you can replace "quantum computing" in this rant with almost any other major gov science initiative and it would still have relevance. Lunar colonization. Nuclear fusion. Genome therapy. All 5 years away since 2000. All require major materials breakthroughs. All hyped in the glossy popular media ;)
[+] alisonkisk|5 years ago|reply
I can scarcely imagine a worse concept to try to explain in tweet form.
[+] DebtDeflation|5 years ago|reply
That's a surprisingly good tweet thread that does an excellent job of pragmatically walking the line between "quantum computing is impossible" and "real quantum computers are only 2-3 years away".

Advances are being made, but we're still very much in the primitive science experiment stage.

[+] boublepop|5 years ago|reply
Seems a lot like a deliberate lie in order to gain funding. While Microsoft's backing means they have solid financials, they are also pulling in serious amounts of funding from state actors, to the point where in Denmark, for instance, they were one year described as "draining the entire state innovation fund". Fundraising based on articles that were knowingly manipulated to support untruthful claims really should be treated like financial fraud. Though most likely this will end with just a reprimand and no real consequences.
[+] jacquesm|5 years ago|reply
It may be a loss for Microsoft but it is a win for the scientific method. And it is also one more score for the push to include all data with scientific papers and not just the pretty version.
[+] klyrs|5 years ago|reply
The authors say that they cut the data out for "aesthetic" reasons. Personally, I don't buy it. They claimed a groundbreaking result, that obviously doesn't hold for the "ugly" data. It's possible that they deluded themselves, but from where I sit, that just counts as two more victims of fraud.
[+] haltingproblem|5 years ago|reply
I would really like to understand how committing outright fraud by omitting contra data for "aesthetic" reasons and then getting caught is a win for the scientific method.

By this logic what would be a loss for the scientific method? Committing fraud and never getting caught?

[+] ThePhysicist|5 years ago|reply
Oh wow, looking at their graph and the portion they cut out for "aesthetic reasons", this looks almost like fraud to me. It's at least hard to imagine how Kouwenhoven or anyone in his group could not have found this highly problematic.

I did my PhD in a similar field (superconducting quantum computing), and if I had cut data from a graph like that I would have gotten a really strong reprimand from my supervisors. If you take a series of IV curves at equidistant points you cannot simply cut out the data you don't like, and if you do remove irrelevant data points (again, you probably shouldn't), you need to replace them with a placeholder value such as a black background to make it immediately clear that some data was left out. Just cropping your graph (which also invalidates most labels on the X axis) is insane; they hammer that into your head in the first undergrad lab classes already. I'm sure that if they had marked the missing curves with a placeholder, the reviewers would have noticed immediately and the article would probably not have been published. So from an outside perspective this doesn't look good at all for the authors.

Well, I guess this shows why it's so important to publish the raw data as supplementary material along with the article.
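The "placeholder instead of silent cropping" practice described above can be sketched in a few lines. The sweep values here are invented for illustration; the point is that excluded points become visible NaN gaps instead of a shortened axis:

```python
import numpy as np

# Sketch of the practice described above: if points from an equidistant
# sweep must be excluded, mark them with a placeholder (NaN) rather than
# silently cropping them out. All data here is made up for illustration.

bias_mV = np.linspace(-2.0, 2.0, 9)           # equidistant sweep points
conductance = np.array([0.1, 0.2, 0.9, 1.8, 2.0, 1.7, 0.8, 0.3, 0.1])

excluded = (bias_mV > 0.5) & (bias_mV < 1.6)  # points an author might dislike
masked = np.where(excluded, np.nan, conductance)

# NaNs leave visible gaps when plotted, and the x-axis labels stay valid;
# deleting the points instead would silently shift every remaining label.
print(masked)
```

Most plotting libraries (matplotlib included) break the line at NaN values, so the omission is immediately apparent to a reviewer.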

[+] c1ccccc1|5 years ago|reply
I remember that when I was working as an undergrad research assistant in a physics lab, a PhD student and I had plotted some data, to see if the relationship between 2 variables was linear. We had cut off many of the data points to the left and right, keeping only a small region in the middle. When we showed it to our supervisor, he came back with "of course it looks linear if you only look at a small interval, that's just Taylor expansion." He then explained to us that it's incredibly scientifically sketchy to leave out perfectly good data from one's plots. Not fooling yourself is surprisingly hard when you're wishing for a specific outcome.

On a related note to the supplementary data thing, I've recently seen a lot of people complaining online about not being able to reproduce the results of machine learning papers that don't have the code for their models included in the supplement. "Code, or it didn't happen." I guess each field has its own problems with replication of results.
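The supervisor's Taylor-expansion point above is easy to demonstrate numerically: any smooth curve looks linear on a small enough interval, so restricting the plotted range can manufacture apparent linearity. A minimal sketch using sin(x) as the stand-in nonlinear relationship:

```python
import numpy as np

# Illustration of the point above: a smooth nonlinear curve is well
# approximated by a line on a small interval (first-order Taylor expansion),
# so cropping the data range can make any smooth relationship "look linear".

def max_linear_residual(f, lo, hi, n=200):
    """Largest deviation of f from its least-squares line over [lo, hi]."""
    x = np.linspace(lo, hi, n)
    y = f(x)
    slope, intercept = np.polyfit(x, y, 1)
    return float(np.max(np.abs(y - (slope * x + intercept))))

narrow = max_linear_residual(np.sin, 0.0, 0.2)  # tiny residual: "looks linear"
wide = max_linear_residual(np.sin, 0.0, 3.0)    # large residual: clearly not
print(narrow, wide)
```

On the narrow window the best-fit line is essentially indistinguishable from the curve; only the wide window reveals the nonlinearity, which is exactly why discarding the flanking data points was sketchy.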

[+] sn_master|5 years ago|reply
Anyone else wondering how much bonus and promotions were obtained in Microsoft as a result of that "research"?
[+] klyrs|5 years ago|reply
Personally, I think I'd enjoy the fame of being the first to prove the existence of Majorana fermions much more than whatever promotions / bonuses etc may have been on the table. One might expect a Nobel prize for such a result. This may be cynical, but my guess is that they wanted to get their stake in the ground first, and optimistically saw the misbehaving data as a hardware glitch that would get ironed out before they received any real scrutiny.
[+] mathattack|5 years ago|reply
Did we really think the folks who take 3 versions to get something correct would get Quantum Computing right on the first try?

Saying this is the death of QC is like saying Windows 1 means the GUI is a worthless concept.

[+] contextfree|5 years ago|reply
scientific research isn't the same as commercial product development.
[+] jacquesm|5 years ago|reply
Are you suggesting they'll take another two tries to get a working quantum computer?
[+] DonHopkins|5 years ago|reply
"The authors later told us it was done for aesthetics."

I sure hope the scientists developing the COVID-19 vaccines didn't cut out any data points "for aesthetics".

[+] skocznymroczny|5 years ago|reply
Even if they did, it's unlikely we will find out in the current political climate. We have to wait until 'end of pandemic' before people will rationally analyze the data.
[+] poletopole|5 years ago|reply
Does anyone know if this is the same quantum computing project as their anyon quantum braiding research?
[+] caycep|5 years ago|reply
How can they tell?

...sorry...quantum joke

[+] Twisell|5 years ago|reply
Is this the first sign that we have nearly reached the end of the "quantum computing" hype cycle?

Meanwhile the M1 is doing "boring optimization of binary computing"...