It's insurance money. If you're a manager of a big company like IBM, Microsoft or Google, you have to align your current product portfolio and future portfolio in such a way that shows your investor that your company will keep growing, even if your current products are stagnant.
You can surely say quantum computing won't do much in the next 5 years. But what about 10 years? 20 years? 30 years? The farther you look into the future, the bigger the probability of a huge tech breakthrough that could give the company that has it a massive edge on the market.
Even if there's only a 1% chance of a transistor-style revolution coming from QC, it becomes an arms race. If Google starts researching it, IBM will follow suit, and so will Microsoft. If in 30 years this turns out to be a big deal, no one will be 30 years behind.
I think you are describing the company dynamics accurately, but I can't help thinking this is just a terrible way to invest. No party has a concrete plan or vision for how to use it; they just throw money at it because there is a consensus of good feeling around it. Those good feelings were probably created through academic or corporate marketing efforts in the first place.
I mean these companies have Research and Development divisions. I was at IBM at the turn of the century when they spent 6 billion dollars on research. One of the pushes was to get research to focus on things they could market and make money with.
But the big research plays (Bell Labs, Xerox PARC) seem to get less and less funding, if they exist at all. A lot of the inventions of those places were monetized outside those companies. IBM had a chip fab in the research building… that business was long since spun off.
At the turn of the century IBM was researching quantum computing, but as I was leaving, selling services was IBM's big push.
Supposedly one of the really big, important things one could do with a quantum computer (QC) is quickly solve, to optimality, instances of NP-complete optimization problems, e.g., problems in scheduling, resource allocation, logistics, etc. These can be formulated as linear programming problems (requiring just knowledge of linear equations) where we want all the variables to take whole-number values, that is, integer linear programming (ILP).
Okay, integer linear programming problems.... To get all excited about quantum computing (QC), one needs to get excited about the big money to be saved by solving all those important, practical ILP problems.
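For readers who haven't seen one, here is the shape of a tiny ILP instance, solved by brute-force enumeration. The numbers are made up for illustration; real solvers use branch-and-bound and cutting planes, and real instances have thousands of variables:

```python
# Toy ILP: maximize 5x + 4y subject to 6x + 4y <= 24 and x + 2y <= 6,
# with x, y non-negative integers. The linear objective and linear
# constraints are easy; it's the integrality requirement that makes
# the general problem NP-hard.
best_val, best_sol = None, None
for x in range(5):            # 6x <= 24 implies x <= 4
    for y in range(4):        # 2y <= 6 implies y <= 3
        if 6 * x + 4 * y <= 24 and x + 2 * y <= 6:
            val = 5 * x + 4 * y
            if best_val is None or val > best_val:
                best_val, best_sol = val, (x, y)
print(best_sol, best_val)     # prints (4, 0) 20
```

Enumeration obviously doesn't scale; the point is only what "formulated as an ILP" means in the comment above.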
Okay, I had a good background in pure/applied math and in computing and got into ILP for scheduling the fleet at FedEx. Since the promised stock was 1+ years late, I ran off and got a Ph.D., in one of the best programs, in hopefully useful pure/applied math, and much of that work was in ILP.
Here is some blunt truth about the NP-complete problems and the cartoon at the beginning of the famous book by Garey and Johnson: The math guys were talking to their manager explaining that they couldn't solve the manager's problem but neither could some long line of other math guys.
Here the blunt part is the meaning of "solve": get an optimal solution to any instance of the problem, including the worst cases, with a computer program running in time only polynomial in the size of the problem. And here optimal means down to the last penny to be saved. So, for some network deployment by AT&T that was to cost $1 billion, save down to the last penny, in polynomial time, including for the worst-case instance of the problem.
Yup, maybe the savings would be $51,937,228.21. And we do want to save that last penny. But if the manager would settle for saving just the first $51,900,000.00 in reasonable computer time for all or nearly all the actual instances of the manager's real problem, then there would be little or no difficulty. And we should be able to tell the manager that savings of more than $55 million, or some such, were impossible; that is, we'd have an upper bound.
So, much of the difficulty was saving the last $37,228.21, guaranteeing to do so, for all instances of the problem, including the worst cases.
Well, I can assure readers that had I insisted on a career saving, e.g., $51,900,000.00 where savings of $55 million were impossible, I would have spent the last several decades homeless on the streets, or dead from being homeless on the streets -- no joke.
Bluntly, there just is no significant demand for solving ILP problems in practice. The "managers" don't want to get involved.
Selling pizzas from the back of a truck? Sure -- might sell 100 pizzas a day. Selling solutions to ILP and other NP-complete problems -- f'get about it.
Uh, since there is no significant demand for saving $51,900,000.00 with a bound of $55 million, there stands to be no significantly greater demand for saving $51,937,228.21.
Thus, there stands to be no significant value in QC for solving NP-complete ILP problems. Sorry 'bout that. If some people wanted to get the $51,900,000.00 in savings, they've been able to do that for decades and have voted loud and clear: "We don't care."
E.g., in one of my attempts, a guy sent me an ILP problem, we talked, and two weeks later I had running code that in 900 seconds on a slow computer got a feasible solution guaranteed to be within 0.025% of optimality. The problem had 600,000 variables and 40,000 constraints. I had done the work for free. Still, then, suddenly he was not interested.
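A guarantee like "within 0.025% of optimality" comes from holding the feasible solution (a lower bound on achievable savings) against an upper bound, e.g., from the LP relaxation or a dual solution. A minimal sketch with made-up numbers chosen to land near that gap:

```python
# The feasible solution certifies we can save at least this much.
incumbent = 51_900_000.00
# The LP relaxation (or a dual bound) certifies no solution saves more.
upper_bound = 51_912_975.00
# The relative gap bounds how much could still be left on the table.
gap = (upper_bound - incumbent) / upper_bound
print(f"optimality gap = {gap:.4%}")  # about 0.025%
```

No one needs to find the true optimum to issue this certificate; the bound does the work.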
So be it.
There was another one: I was writing the code using the idea of a strongly feasible basis, and suddenly the customer was not interested and returned to some not very good heuristic code he had.
Better, a lot better, to sell something a lot of people actually want, e.g., a lot better to sell pizza.
And I am doing a startup that to me continues to look good, software running, but it has nothing to do with NP-complete or ILP and wouldn't be helped by QC.
So, to me, e.g., even if Google gets a good QC that can solve ILP problems, then I don't believe that they will have many customers or much of a business and there will be no big reason for IBM or Microsoft to worry.
Since there is no significant demand for using ILP to save money now, I don't see a significant demand for using QC on ILP to save money in the future.
Their employees might be better off selling pizzas. Let's see: from some of my arithmetic about the costs of pizza, one can make a pizza for $2-3. From a pizza truck in a good location you might be able to sell the pizzas for an average of $10 each, e.g., an extra $1 for anchovies! Might sell 100 pizzas a day for $1000 a day, maybe 20 days a month. Looks like a better career than QC research!
If there is no demand for pizzas, then there won't be much demand for pizzas with anchovies.
Uh, the Google QC researchers are well paid? Terrific -- park the pizza truck near the Google QC research building!!!!
For some parts of US national security, the situation for a good QC might be significantly different -- I doubt it, but maybe.
A negative (or, depending on your view, positive) outlook is that it is a disinformation campaign so one may maintain the lead in a particular trajectory of technical dominance. Meanwhile, as an extra game-theoretic safety precaution that also amplifies the disinformation campaign, one funds any research in the direction of the campaign, as both a distraction and an 'impossibility canary.'
I don't understand this argument at all. Of course it isn't making money yet -- that's because it's an early technology that is still being researched. Sure, it might never mature, but it seems crazy to call it a "bubble" or to analyze it based on current sales figures.
That wasn't the argument made in the article. There are reasons to believe that the technology is fundamentally unsound, and will never be able to scale or make money.
It's not actually early technology, it's been developed since the 80's. And if the underlying theories are unsound - if it doesn't even work in theory - then putting more money in won't make it magically viable.
If the output side is saturated with work load but the input side keeps growing to blow up without significantly changing the output, or at worst affecting it negatively (quantum blockchain buzzword bingo), it may be fair to speak of a bubble.
I think his model of the situation is short-sighted, to say nothing of the callbacks to that management principle involving transistors.
If you're thinking that the whole purpose of QC will be quickly subsumed by wide algorithms with superpolynomial speedup, you might be missing the point. It's about how computers are built, not about stuffing one specific abstraction into another. If suddenly we discover we can build a machine that can generate random numbers a quadrillion times faster than any current hardware design, that's a new space in computation.
I mean consider how widely deployed the parallelism construct is now, and that Amdahl's law was elucidated in the 60's.
Parallelism was just one degree of freedom for us to climb the S-curve on; quantum computing seems to provide essentially a continuum of them.
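Amdahl's law can be stated in three lines; the parallel fractions below are illustrative, not measurements:

```python
# Amdahl's law: with parallel fraction p of the work and n processors,
# speedup = 1 / ((1 - p) + p / n). The serial fraction (1 - p) caps
# the speedup at 1 / (1 - p) no matter how many processors you add.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

print(amdahl_speedup(0.95, 16))     # roughly 9.1x
print(amdahl_speedup(0.95, 10**9))  # approaches the 20x ceiling
```

It took decades between that observation and parallelism becoming a routine part of how we build systems, which is the point being made here.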
I think most people, including the author, would agree QC should be funded for fundamental research reasons. But that is clearly not the way it is being pitched to VC. Right now there is no clear use-case, that's what I felt he was warning against. If nothing materialises soon, he's probably correct to say this is a bubble.
I have a fair amount of experience in this space. It's at, like, the vacuum-tube era, at best. There is a definite opportunity for advancement, but it is still extremely early.
We are building user interfaces that make it easier to “play around” with quantum computing phenomena—especially with music and art—with the idea that our aesthetic sensibilities may help drive discovery.
This is something of a low-effort article, with a short-sighted focus on immediate profitability. There are many scientific programs that didn't really become private-free-market revenue generators for decades at least (the US space program, for example).
An article with a little more depth might examine the future of trapped-ion quantum computing, for example:
As far as the 'make money off new drugs' mentality goes, that's not really where QM chemical simulation in molecular dynamics seems most promising; it's more about things like the design of new catalysts to improve the efficiency of various industrial processes.
If QM computation is eventually developed, the devices will almost certainly be large and extremely expensive (kind of like the cutting-edge chip fab machines of today in scale). For most businesses, it's unlikely the benefit of owning one will justify the cost, so it'll probably be a national lab / research center type thing.
The key consideration with investments is ROI. When the investor is a government, it can afford to take the long perspective. For most companies and institutional investors, this works less well.
The key mechanism to protect inventions is patents. Patents have a limited shelf life. If you file a lot of patents today and it takes 30 years before you can apply them, they will have expired by then and others are free to take your inventions and build on that. So, if quantum computing requires another three decades to start making money, most of the companies that are currently being invested in will have failed and their patent portfolios and investment will be worthless. Their patents will have expired, their founding scientists will have moved on or retired, etc. At best those companies may be in a position to file more patents. So, any investors investing right now are making bets on how long it will take before there's a meaningful market to get an ROI and which companies are positioned best to take a chunk out of that market. The further that is out, the higher the risk of losing their investment.
There are billions flowing into quantum computing and the article is simply making the point that in terms of revenue potential there seems to be a lot of uncertainty about the practicality of current approaches, the lack of any real revenue (beyond consulting people on how awesome it would be if we had working quantum computing, etc.). And the lack of perspective on when all this will change. Very valid points. There are a few big companies investing in this stuff but none of them is betting their company on it. It's a side show at MS, Google, IBM, etc.
A long shot that might create some viable business decades further from now but if it all fails, their stocks will be fine. There's enough substance there for them to want to have a finger in the pie if it does take off but none of these companies seems to be counting on that happening any time soon.
There’s also billions going into commercial fusion reactors, which haven’t turned net positive yet. The goal of the investment is to build that capability though, same (I think?) as with quantum computing. Weird critique imo.
Thing is fusion is known to be possible - the sun and hydrogen bombs. Quantum Computing lacks equivalent existence proofs. There's a lot of abstract theory but no real indication it is possible to scale up in the real physical world to useful problem sizes and reasons to doubt that it is possible.
As the tech is still unproven, it's research, rather than building capability, that they're spending money on. I hope all these investors understand this...
The physics underpinning fusion was proven in the 1950's. Since then it's been an engineering problem.
The physics underpinning QC was arguably proven in the 2020's. It's not quite done (in the way that fusion was not quite done in the 50's), but there is a fairly clear set of demonstrations that QCs with error correction are possible. However, the engineering barriers are fierce, and there is still a possibility that they are insurmountable. In addition, there are concerns that, while QC will work, the class of problems that are in NP and also in BQP may be very small. Even if a problem is in that group, it may be that the algorithms we have offer only quadratic or sub-quadratic speedups, meaning that the improvement they offer over classical algorithms may be marginal.
Worse, there are often very good heuristic approaches to some of these problems, which means that although a superquadratic QC approach would be an amazing breakthrough in computer science (genuinely amazing, worthy of accolades and prizes, and fundamentally important for our understanding of the universe, etc.), it would possibly offer only marginal economic value. Now, this is not true of some problems where there are exponential explosions and no good heuristics... but there is an even worse catch, which is that quantum algorithms offer computer scientists fresh insight into what's tripping up the classical approaches. In this scenario an amazing breakthrough happens in QC, and someone uses it to get an insight that pushes the classical approach close enough to the QC approach as to render the QC approach marginal.
The theoretical picture is moving very fast though - so we will have to see.
On the other hand, the practical side is moving more slowly. We see announcements that make one think a Moore's-law type of scaling is happening, but hidden in the small print there are often (always, as far as I can decode) catches that mean that while the results look great, they are still very much mired in problems. For example: are all the qubits on a QC usable at once? Can they be used to form an actual algorithm? How long does the machine run for? How long does it take to start? Some of the answers are jarring -- often only a small subset of a machine can be used in an actual problem-solving episode; sometimes the machines run for a few steps only; sometimes the machines take 24 hours or longer to start.
It has taken 70 years to nearly build fusion reactors, it took 70 years to create mRNA vaccines. It may well take 70 years (from now) to build practical, valuable quantum computers. And something could go wrong on that path that just renders them moot.
One point of interest was the paper on quantum computing applied to quantum chemistry[1]. In that paper, they did not find generic exponential speedup for a list of chemistry problems with current quantum algorithms. There are 3 problems with this: a speedup does not need to be exponential in order to be incredibly valuable; a speedup does not need to be extremely generic, just enough to cover real-world use cases; and quantum algorithms are still in their infancy, and it's unclear how much more we might discover in the next 10-20 years.
Furthermore, the paper itself links to a github repository[2] with a list of papers that either imply or use an exponential advantage in quantum chemistry. Now would be a good time to mention that I am not an expert in chemistry, nor have I read the entirety of this list of papers so I am not in a position to go through each and every one to decide how generic their results are or what the limitations are. Perhaps all these papers have fundamental limitations that prevent it from being useful in normal chemistry, only in weird souped-up problems specifically devised for a quantum advantage.
Either way, this paper is by no means conclusive on the subject. There's a ton of more research to be done in multiple fields to know for sure.
The reason exponential speedups are required is due to the extreme cost of quantum computing R&D and extremely limited quantum computers that come out of it.
I can provision 1k CPU based servers or ~20 4x GPU based servers in a cloud computing environment for an hour for <$400. These are mature technologies with massive economies of scale behind them. A quantum computer needs to not only outperform scale out GPU/CPU performance on a particular problem set, it needs to crush it.
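A back-of-envelope sketch of how brutal that bar is. Assume a hypothetical quadratic (Grover-style) speedup, so the quantum machine does roughly sqrt(N) steps where classical hardware does N; both per-step costs below are invented for illustration, not real figures:

```python
import math

# Assumed per-step costs in dollars: classical steps are dirt cheap and
# massively parallel; quantum steps are slow and the machine is expensive.
# These numbers are made up to show the shape of the break-even point.
classical_cost_per_step = 1e-12
quantum_cost_per_step = 1e-3

def cheaper_on_quantum(n: float) -> bool:
    return math.sqrt(n) * quantum_cost_per_step < n * classical_cost_per_step

print(cheaper_on_quantum(1e12))  # prints False: classical still wins
print(cheaper_on_quantum(1e20))  # prints True: the speedup finally pays off
```

Under these assumptions the break-even problem size is astronomical, which is why only exponential (or at least superquadratic) speedups change the economics.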
Note that he only recently finished his PhD in an area tangential to quantum computing, so he is not a giant in the field like, say, Aaronson, whose thoughts on this industry I would be interested in knowing.
I do in fact agree with most of the article. However, I also think that if you measure the ratio of scientific/technological impact to funding, I would place quantum computing far higher than many other hype bubbles such as crypto, blockchain, and web3.
In other words, the funding QC is getting is nowhere close to the other hype bubbles, and there are some significant peer-reviewed results that have been generated from it, so for the time being you can still give it the benefit of the doubt.
For example it has definitely enhanced our understanding of quantum chemistry and computational complexity, and anyone who invests time learning QC will end up having solid new insight about how the world works and deep engineering knowledge of electronics, which you can't say about many other bubbles.
For example, compare how many QC startups YC has funded (I think 0?) compared to blockchain, crypto, AI-assisted medicine and web3. There is no comparison. Picking on QC is far below my list if you want to have a go at hype bubbles.
>That means these firms are collecting orders of magnitude more in financing than they're able to earn in actual revenue — a growing bubble that could eventually burst.
>"The little revenue they generate mostly comes from consulting missions aimed at teaching other companies about 'how quantum computers will help their business,'"
All bubbles are not equal in risk, folly, or long term sustainability.
In the case of the Internet Bubble stocks were down 78%, but it was not hard to do well in the end given diversification and a long enough horizon.
In the case of the Dutch Tulip bubble there was no good ending for anyone except those who got out early.
Some bubbles like NFTs generate strong opinions but have yet to have final judgment from history.
I think the quantum computing bubble is different than all three, but closer to the Internet than to Tulips. In which case the conventional strategy would be to diversify and expect a long time horizon.
> In the case of the Internet Bubble stocks were down 78%, but it was not hard to do well in the end given diversification and a long enough horizon.
This is untrue. If you were invested in what became the stars of that era: Amazon, Red Hat, Cisco, a few others, you eventually made decent money, although far worse than if you had stayed out and bought the dip.
If you had a diversified portfolio of 'new economy' stocks which didn't include a few winners like this, you might have lost over 95% of your money and never got it back. Lots and lots of stocks simply disappeared or were bought for peanuts. Many others, including lots of very very highly rated ones like Yahoo never exceeded their bubble-era peaks.
I'll get downvoted into oblivion for this, but literally most of my old academic friends are in on this grift, old advisors own some of the largest enterprises in this 'sector', and all these people (in private) regard quantum computing as a money grab. At best as a way to fund research.
And that's it! The author of this article is 100% right. Markets are fully aware though, go ahead and try to short any publicly traded QC stock lol. You can't. There's no shares to borrow and no liquidity on puts....
This article is kind of crap. I kind of expected better from the FT.
-----
The reality is that none of these companies — or any other quantum computing firm, for that matter — are actually earning any real money.
-----
ORLY? I guess I should go massively short IBM shares then. https://newsroom.ibm.com/image/2022%20IBM%20Quantum%20Roadma...
----
Shor’s algorithm has been a godsend to the quantum industry, leading to untold amounts of funding from government security agencies all over the world. However, the commonly forgotten caveat here is that there are many alternative cryptographic schemes that are not vulnerable to quantum computers. It would be far from impossible to simply replace these vulnerable schemes with so-called “quantum-secure” ones.
----
ORLY?
New cryptography can take 20 years or more to be fully deployed to all National Security Systems. NSS equipment is often used for decades after deployment. National security information intelligence value varies depending on classification, sensitivity, and subject, but it can require protection for many decades. -NSA
The solutions we do have do not work very well. Only the weakest, FALCON-512 (a bad name, as it offered only 64 bits of quantum security; now the dual-lattice attack seems to reduce this to 20?), actually fits the TLS use case without breaking the internet. The signatures are just too big. Cloudflare has testing that proves this.
If that weren't enough, this person is completely unaware of the annual survey of quantum researchers that actually puts the arrival of a cryptanalytically relevant quantum computer at 2030 or so. Peter Shor is actually one of the people polled in the survey; this person is not. And you can look at the survey's estimates since 2018: these estimates are clearly trending towards sooner and sooner, instead of further and further away.
According the OP, the quantum computers that are feasible today have no real-world use, and likely won't have any real-world uses in the near term, despite claims to the contrary by executives and salespeople. Quantum computing research, the OP asserts, is an academic pursuit, not a commercial one, but it is funded by investors who think it's the latter. The entire piece can be summarized as "Look! The emperor has no clothes!"
Quoting:
> Billions of dollars have poured into the field in recent years, culminating with the public market debuts of prominent quantum computing companies like IonQ, Rigetti and D-Wave ... These three jointly still have a market capitalisation of $3bn, but combined expected sales of about $32mn this year (and about $150mn of net losses), according to Refinitiv.
> The reality is that none of these companies — or any other quantum computing firm, for that matter — are actually earning any real money. The little revenue they generate mostly comes from consulting missions aimed at teaching other companies about "how quantum computers will help their business", as opposed to genuinely harnessing any advantages that quantum computers have over classical computers.
How did this come about? I think it's a very simple case of misunderstanding and overpromising. There are many interdisciplinary fields that mesh with computation. We have DNA computers and neural Turing machines. QC is a subfield of quantum mechanics, one of many with some interesting applications, but nothing shows that it has open-ended potential to revolutionize computation. But it has the word 'computation' in it, and in the past decades the VCs with the most money came from computer science. So you have a combination of quantum (spoooky, mysterious) with computation (that one I know). I never got why QC was seen as so promising; it's an interesting exercise on paper but is not the 2nd coming of anything. Wish that money had gone to fusion instead: that one we understand now more than ever, and it has very real positive consequences.
> nothing shows that it has open-ended potential to revolutionize computation
Well, we do know that P ⊆ BQP ⊆ PSPACE, and we have one important example that lies in BQP but not in P (for all we know). It's just not clear how important that particular example is for the kind of computing we do today, if it ever becomes practical. It looks like it'd rather result in a one-time nuisance for sysadmins, like Y2K was.
The hope was for applications in new areas like materials and drug design. The author has posted a link to one paper suggesting that we might not see exponential speedups in chemical simulations, but that's not an outright refutation either.
It was definitely sold as the next step in computation.
A real performant quantum computer could potentially revolutionize a lot of industries. But selling it as an accelerator of molecular dynamic simulations is not quite as sexy.
Well, hopefully the quantum industry has fundraised enough to last them for a while. It isn't the exact picture, but minus potential shareholder lawsuits over tanking stocks, these are exactly the sorts of startups that should be comfortable with taking 70%+ haircuts to their valuations.
The observation that important technologies required a long time from inception to practical use is common here. That is true, but it ignores the fact that there were a tremendous number of possible technologies that could have eventually worked out. Only a very small number ever did.
It's not exactly a secret that we are very far away from useful quantum computers. Every comment I've ever seen from people in that industry, except those in a financial position to benefit, has said so.
Frankly, the article reads as if the author has an axe to grind.
On utility, there's more than just Shor's: unstructured search [1], finance ([2], [3]). Even if quantum computers ultimately prove unfruitful commercially, that doesn't render it a useless endeavor. Like String Theory, it can beget findings in other areas, regardless of whether you can profit from them: novel classical recommendation algorithms ([4]), quantum algorithms for SAT that could possibly help automated theorem proving ([5]).
Part of the difficulty of quantum computing is that to show speedup, you need to find complexity bounds on classical problems whose runtime is actively being researched, e.g. neural networks ([6]).
As for their financial worthwhileness, while there is valid concern ([7], [8]), it's far too early to tell: it's hardware, not software. Also, it's my understanding that private investment is much larger than public funding in the US for quantum computing, both of which pale in comparison to China's investment. Thus, I wouldn't want to see investors shy away if the government is unwilling to make up the difference!
Commenting on this, now that it's making its third or whatever pass on HN. Also Scott Aaronson's comments are interesting: https://scottaaronson.blog/?p=6670
> The most prominent application by far is the Shor algorithm (opens a new window)for factorising large numbers into their constituent primes, which is exponentially faster than any known corresponding scheme running on a classical computer. Since most cryptography currently used to protect our internet traffic are based on the assumed hardness of the prime factorisation problem, the sudden appearance of an actually functional quantum computer capable of running Shor’s algorithm would indeed pose a major security risk.
> Shor’s algorithm has been a godsend to the quantum industry, leading to untold amounts of funding from government security agencies all over the world. However, the commonly forgotten caveat here is that there are many alternative cryptographic schemes that are not vulnerable to quantum computers. It would be far from impossible to simply replace these vulnerable schemes with so-called “quantum-secure” ones.
Note that Shor's algorithm breaks not just factoring, but also discrete log, including elliptic curve discrete log. That includes classic DH and DSA of course, as well as ECDSA and ECDH, whether they're over Bitcoin's curve, the other NIST curves, Brainpool, {curve,ed}{25519,448}, pairing-friendly curves, everything. Almost all broadly deployed public-key crypto uses RSA or elliptic curves. Those alternative public-key algorithms are still being worked out, and will take years to broadly deploy, so if a QC gets built, it will probably be able to break into straggling systems for some years. There is also a risk that the replacements will eventually fall to quantum or even classical attack, especially considering that a significant fraction of the proposed replacements already have fallen (most recently SIKE) or been weakened (eg, every multivariate quadratic sig). They may also have other security problems, eg implementation bugs or side-channel attacks.
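As a reminder of what Shor actually buys you: the quantum circuit only does order-finding; everything else is classical number theory. A sketch where the order is found by (exponential) brute force, which is exactly the step a quantum computer would replace:

```python
import math

def order(a: int, n: int) -> int:
    """Smallest r with a^r = 1 (mod n), by brute force.
    This loop is the exponential part Shor's quantum step replaces."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n: int, a: int):
    """Classical post-processing: turn the order of a mod n into factors."""
    r = order(a, n)
    if r % 2 == 1:
        return None              # odd order: pick another a and retry
    x = pow(a, r // 2, n)
    if x == n - 1:
        return None              # trivial square root: retry
    return math.gcd(x - 1, n), math.gcd(x + 1, n)

print(shor_classical_part(15, 7))  # prints (3, 5)
```

The same order-finding subroutine is why discrete log (and hence all the elliptic-curve systems listed above) falls too.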
The surviving quantum-secure algorithms are all either pretty inefficient (McEliece and SPHINCS+, and CSIDH and SQISign, but those are also bleeding-edge), or use structured lattices (Kyber, Falcon, Dilithium, NTRU and NTRU Prime, etc.) or structured codes that look kind of like structured lattices (BIKE, HQC). So we'll have most of our eggs in just a couple of baskets again, and outside of applications that can use McEliece and SPHINCS+, they'll be newer, less-tested baskets. Also, while fast, the structured lattice and structured code systems still use significantly more bandwidth than elliptic curves.
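To put rough numbers on "significantly more bandwidth": wire sizes in bytes from the published parameter sets (Kyber-768 and Dilithium2 per their NIST submissions; treat these as ballpark, since the final standards tweaked details):

```python
# Approximate wire sizes (bytes) of classical ECC vs. lattice-based PQC.
sizes = {
    "X25519 public key": 32,
    "Kyber-768 public key": 1184,
    "Kyber-768 ciphertext": 1088,
    "Ed25519 signature": 64,
    "Dilithium2 signature": 2420,
}
for name, nbytes in sizes.items():
    print(f"{name}: {nbytes} bytes")
```

Roughly a 30-40x blowup per handshake object, which is what the Cloudflare TLS experiments mentioned above were measuring.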
Using long-term symmetric keys instead of or in addition to public-key crypto is possible in some applications, but it's obnoxious and limiting: you'd end up with some combination of Kerberos derivatives (with trusted third parties acting as single points of security failure), mailed smartcards or other secrets, and physical in-person meetings to set up shared keys.
So the bigger issue in my view is that outside of Bitcoin, breaking crypto is mostly a net negative for society. Transitioning to quantum-secure crypto is also a negative, in that it will take a ton of work and the replacements are less efficient than elliptic curves, and may have security problems. (It's also probably unavoidable because governments will try to build QCs to break crypto even if private industry doesn't.) So all this money is being spent on something whose first major application will be negative, if it even works at all. Hopefully the positive stuff will outweigh this.
Sometimes I wonder if we should try building a news site optimised for seeing the effect of appeals to authority.
To write an article for the site, we would need to:
1. Write a headline with no mentions of any experts.
2. Write another headline mentioning at least one expert in it.
3. Write the content without mentioning any experts.
4. Write the content and sprinkle names of experts as needed.
5. Publish.
Now, the reader would then:
1. Be exposed to the no-experts version of the article - both headline and content.
2. Once finished, the reader will be prompted to write their thoughts on the article.
3. Click “Reveal”.
4. The reader would then skim or read the whole article again, but this time it would mention the experts.
5. Prompt the reader to evaluate how their thoughts had changed after reading the expert version of the article.
I’m so gullible: seeing experts in anything, especially when names of prestigious institutions or titles are tacked onto them, tends to shut down the reasoning part of my brain altogether.
Bear in mind, the site I proposed is not a place to police how articles should be written; rather, it’s all about increasing its readers’ awareness of how much mentions of an authority can impact their initial reasoning and judgement, and sometimes make them stop reasoning at all. My view is that mentions of an authority are useful for calibrating our judgements after we have tried to reason on our own, but not before that.
And yeah, I have no opinion on the original post. Just like to go off on a tangent once in a while.
There is value in knowing what experts say. I, and 99% of people in IT, will never be in a position to evaluate whether quantum computing is feasible. The only thing we can do is try to evaluate experts' credibility and pick a side, so to speak.
The same goes for health advice, but there it's 99.9%+: if you're not in the field, you can only listen and hope you're good at estimating who is more credible, likely to do better research, or making more truthful claims. Trying to evaluate the claims yourself is a recipe for being wrong and creating your own bubble.
If the article says "random guy X says quantum computing is a scam because Y", there is nothing I can take away from it, because it's very easy to make Y both incorrect and plausible-sounding to me. If I know it's an Oxford physics professor making the claim, I can at least learn that Y is a serious enough reason not to be easily dismissed.
Appeal to authority is bad as an argument when people knowledgeable in the field debate a certain point. In other cases it's very useful to know who makes the claims, and very often it's the only thing that gives the claims credibility.
I've always wondered if the format of long form expert opinions could be replaced by a knowledge graph that is independent of the expert.
E.g. instead of article "Economist John rejects minimum wage"
Root node "Minimum wage is not the best solution to problem x" -> because -> <node to define problem>, <node to define alternative solutions> -> because -> <leaf nodes of studies or models>
In this way other experts could add to the graph and the differences between different branches of argument could be more easily compared or automatically updated. Articles could still be written, but could reference specific nodes or edges of the graph which adds clarity to the discussion.
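A minimal sketch of what such a claim graph might look like in code. All node names and the `kind` labels are invented here purely for illustration, not a proposed schema:

```python
# Toy model of the "claim -> because -> evidence" graph described above.
class ClaimNode:
    def __init__(self, claim, kind="claim"):
        self.claim = claim
        self.kind = kind      # e.g. "claim", "problem", "alternative", "evidence"
        self.because = []     # "because" edges to supporting nodes

    def support(self, node):
        """Attach a supporting node and return it, for chaining."""
        self.because.append(node)
        return node

    def leaves(self):
        """Collect the leaf nodes (studies/models) under this claim."""
        if not self.because:
            return [self]
        out = []
        for child in self.because:
            out.extend(child.leaves())
        return out

# Build the example from the comment above (hypothetical content).
root = ClaimNode("Minimum wage is not the best solution to problem X")
problem = root.support(ClaimNode("Definition of problem X", kind="problem"))
alt = root.support(ClaimNode("Alternative: wage subsidies", kind="alternative"))
alt.support(ClaimNode("Study A (hypothetical)", kind="evidence"))
alt.support(ClaimNode("Model B (hypothetical)", kind="evidence"))
```

Because each expert's contribution is just a node or edge, two experts who disagree would share most of the graph and differ only in a few branches, which is exactly what makes the comparison mechanical rather than rhetorical.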
Well, I don’t know, I do think that expert opinions can carry weight. Like: “according to Dr Malcolm Alan of the department of geology at Harvard university, this kind of rock usually indicates…”. I mean knowing that a professor of geology said that is evidence that a fact is true, isn’t it?
Having done that, you could then write a paper on your findings. You would then be an expert! You could then rerun the experiment using your paper as the story in order to validate your initial findings…
> at some point the claims will be found out and the funding will dry up.
Someone hasn't been watching the cryptocurrency markets.
That's partly tongue in cheek. But there are countless examples of the market remaining irrational longer than one can stay solvent.
Witness the continued success of BTC and Ether, amid newer options that outperform them on every tech-related metric, often by many orders of magnitude. I conclude that marketing hype and the first mover advantage form the vast bulk of valuation in a novel tech that people don't understand.
This is not to take away from the author's point at all - I would hope that anyone who invests in quantum computing reads the criticism from an insider who can actually read the papers.
However, as irrational and harmful as it is, I don't expect BTC to drop to zero before the day quantum computing actually does follow through. Rationality really isn't our thing.
>> I conclude that marketing hype and the first mover advantage form the vast bulk of valuation in a novel tech that people don't understand.
Well said. Having worked in the blockchain space as a developer and founder since 2017, I've come to the same conclusion. The formula for success is hype + first-mover advantage; aside from that, it's all about social climbing and politics around those projects.
It's surprising how long the first-mover advantage lasts, and it's weird to see that even developers who should know better are getting pulled into learning poorly designed (or outdated) technologies. They're conflating the financial achievements of projects with their technological achievements.
I guess that's what happens when big investors are laser-focused on making as much money as possible instead of also trying to drive innovation forward.
IMO, the inability to separate the two is a major reason why we have such significant financial bubbles in the tech sector.
Crypto is a bad example because it's pretty pointless in general. You can have better, faster, more efficient, safer, easier-to-use tech, but at the end of the day you're only achieving the same thing: skipping over regulations that apply to traditional currencies but somehow don't apply to crypto. That's assuming good will, as one can argue that creating a distributed Ponzi scheme is another goal. It turns out that to achieve those goals you can get away with a simple idea and implementation like BTC's.
It's really not a surprise it's not about tech when the goals and challenges were never technical.
On the other hand, in areas where it is about tech, we see the superior technology winning over established players all the time: Google, WhatsApp, and AMD, to name three of many examples from various points in this millennium.
We changed the url from https://futurism.com/the-byte/oxford-physicist-unloads-quant..., which points to this. We wouldn't do that if the site was hardwalled, but given that there are workarounds posted in this thread, it's more important to have the original source.
The article must have been written by someone who is not an expert in this field, or by an expert who is suppressing information to disinform with their conclusion.
I say this because:
1. They say nothing about the breakthroughs in quantum error correction that are allowing IBM to promise a leap from 89 qubits today to 4,000 qubits in 2025 (still not enough on its own for a cryptographically relevant quantum computer, or CRQC, running Shor's algorithm for an exponential speedup in breaking e.g. RSA-2048, which some research suggests would take 20M qubits including those for quantum error correction).
2. They do not mention Grover's algorithm, which provides a quadratic speedup (in the time complexity of searching for a particular string in an unsorted list of N items) over its classical counterparts. Even a quadratic speedup is considerable when N is large.
3. They do not mention the breakthrough by University of Chicago researchers showing that multiple quantum computers can be entangled over tuned optical fibers to act as a single quantum computer. This still doesn't mean we can go from 4,000 qubits to 20M by networking 5,000 of the quantum computers IBM promised for 2025, but it provides a trajectory for networked quantum computing as a horizontal scaling strategy.
4. They do not mention the $100B allocated this year by the White House and Congress for CRQC research.
What is the author's motive in giving us such an incomplete story with such a skewed conclusion? Are they working for a hedge fund that is shorting some stock? Or are they just a layperson trying to sound intelligent by writing about a field where they're not sufficiently informed?
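For a sense of scale on the quadratic speedup in point 2, here is a toy comparison of query counts, using the textbook figure of roughly (π/4)·√N Grover iterations and deliberately ignoring error-correction and gate-speed overheads (which are enormous in practice):

```python
import math

# Classical unstructured search needs on the order of N queries in the
# worst case; Grover's algorithm needs about (pi/4) * sqrt(N) iterations.
def classical_queries(n):
    return n  # worst case: inspect every item

def grover_iterations(n):
    return math.ceil((math.pi / 4) * math.sqrt(n))

for n in (10**6, 10**12):
    print(f"N={n}: classical ~{classical_queries(n)}, Grover ~{grover_iterations(n)}")
```

At N = 10^12 the gap is about a million-fold in query count, which is why "even quadratic speedup is considerable when N is large" — though whether each quantum iteration is anywhere near as cheap as a classical one is a separate question entirely.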
kalimanzaro|3 years ago
https://archive.ph/0VB0K
Valuation over value.
cerol|3 years ago
bottled_poe|3 years ago
sicp-enjoyer|3 years ago
acomjean|3 years ago
graycat|3 years ago
Okay, integer linear programming problems .... To get all excited about quantum computing (QC), need to get excited by the big money to be saved by solving all those important, practical ILP problems.
Okay, I had a good background in pure/applied math and in computing and got into ILP for scheduling the fleet at FedEx. Since the promised stock was 1+ years late, I ran off and got a Ph.D., in one of the best programs, in more, hopefully useful, pure/applied math, and much of that work was in ILP.
Here is some blunt truth about the NP-complete problems and the cartoon at the beginning of the famous book by Garey and Johnson: The math guys were talking to their manager explaining that they couldn't solve the manager's problem but neither could some long line of other math guys.
Here the blunt part is the meaning of "solve" -- with a computer program running in time only a polynomial in the size of the problem get an optimal solution to any instance of the problem including the worst cases. And here optimal means down to the last penny to be saved. So, for some network deployment by AT&T that was to cost $1 billion, save down to the last penny, in polynomial time, including for the worst case instance of the problem.
Yup, maybe the savings would be $51,937,228.21. And do want to save that last penny. But if the manager would settle for saving just the first $51,900,000.00 in reasonable computer time for all or nearly all the actual instances of the manager's real problem, then there would be little or no difficulty. And should be able to tell the manager that savings of more than $55 million, or some such, were impossible -- that is, have an upper bound.
So, much of the difficulty was saving the last $37,228.21, guaranteeing to do so, for all instances of the problem, including the worst cases.
Well, I can assure readers that should I have insisted on a career saving, e.g., $51,900,000.00 where savings of $55 million were impossible, then I would have spent the last several decades homeless on the streets, or dead from being homeless on the streets -- no joke.
Bluntly, there just is no significant demand for solving ILP problems in practice. The "managers" don't want to get involved.
Selling pizzas from the back of a truck? Sure -- might sell 100 pizzas a day. Selling solutions to ILP and other NP-complete problems -- f'get about it.
Uh, since there is no significant demand for saving $51,900,000.00 with a bound of $55 million, there stands to be not significantly more demand for saving $51,937,228.21.
Thus, there stands to be no significant value for QC for solving NP-complete ILP problems. Sorry 'bout that. If some people want to get the $51,900,000.00 savings, they've been able to do that for decades and have voted loud and clear "We don't care.".
E.g., in one of my attempts, a guy sent me an ILP problem, we talked, and two weeks later I had running code that in 900 seconds on a slow computer got a feasible solution guaranteed to be within 0.025% of optimality. The problem had 600,000 variables and 40,000 constraints. I had done the work for free. Still, then, suddenly he was not interested.
So be it.
There was another one: I was writing the code using the idea of a strongly feasible basis, and suddenly the customer was not interested and returned to some not very good heuristic code he had.
Better, a lot better, to sell something a lot of people actually want, e.g., a lot better to sell pizza.
And I am doing a startup that to me continues to look good, software running, but it has nothing to do with NP-complete or ILP and wouldn't be helped by QC.
So, to me, e.g., even if Google gets a good QC that can solve ILP problems, then I don't believe that they will have many customers or much of a business and there will be no big reason for IBM or Microsoft to worry.
Since there is no significant demand for using ILP to save money now, I don't see a significant demand for using QC on ILP to save money in the future.
Their employees might be better off selling pizzas. Let's see: From some of my arithmetic about costs of pizza, can do well for $2-3 a pizza. From a pizza truck in a good location might be able to sell the pizzas for an average of $10 each, e.g., an extra $1 for anchovies! Might sell 100 pizzas a day for $1000 a day, maybe 20 days a month. Looks like a better career than QC research!
If there is no demand for pizzas, then there won't be much demand for pizzas with anchovies.
Uh, the Google QC researchers are well paid? Terrific -- park the pizza truck near the Google QC research building!!!!
For some parts of US national security, the situation for a good QC might be significantly different -- I doubt it, but maybe.
pyinstallwoes|3 years ago
Negative. A positive outlook is that it is a disinformation campaign so one may maintain the lead in a particular trajectory of technical dominance. Whilst doing so, as an extra game-theoretic safety precaution, which also amplifies the disinformation campaign, one funds any research in the direction of the campaign as both a distraction and an 'impossibility canary.'
Quite... deliciously deceptive.
aqme28|3 years ago
sfpotter|3 years ago
Cthulhu_|3 years ago
posterboy|3 years ago
meltyness|3 years ago
If you're thinking that the whole purpose of QC will be quickly subsumed by wide algorithms with superpolynomial speedup, you might be missing the point. It's about how computers are built, not about stuffing one specific abstraction into another. If suddenly we discover we can build a machine that can generate random numbers a quadrillion times faster than any current hardware design, that's a new space in computation.
I mean consider how widely deployed the parallelism construct is now, and that Amdahl's law was elucidated in the 60's.
Parallelism was just one degree of freedom for us to climb the S-curve on, quantum computing seems to provide essentially a continuum of them.
sweezyjeezy|3 years ago
dr_dshiv|3 years ago
We are building user interfaces that make it easier to “play around” with quantum computing phenomena—especially with music and art—with the idea that our aesthetic sensibilities may help drive discovery.
oldgradstudent|3 years ago
How is it even remotely close?
Vacuum tubes were a thriving industry, producing many groundbreaking products and services.
wfn|3 years ago
https://quantumdelta.nl/ is some kind of hub but landing pages offer too much hype and too little content :)
thank you.
belter|3 years ago
"Separating Quantum Hype From Quantum Reality" - https://news.ycombinator.com/item?id=32691220
blast|3 years ago
https://archive.ph/cikWE
photochemsyn|3 years ago
An article with a little more depth might examine the future of trapped-ion quantum computing, for example:
https://en.wikipedia.org/wiki/Trapped_ion_quantum_computer
As far as the 'make money off new drugs' mentality, that's not really where QM chemical simulations in molecular dynamics really seems all that promising - it's more about things like the design of new catalysts to improve the efficiency of various industrial processes.
If QM computation is eventually developed, the devices will almost certainly be large and extremely expensive (kind of like the cutting-edge chip fab machines of today in scale). For most businesses, it's unlikely the benefit of owning one will justify the cost, so it'll probably be a national lab / research center type thing.
jillesvangurp|3 years ago
The key mechanism to protect inventions is patents. Patents have a limited shelf life. If you file a lot of patents today and it takes 30 years before you can apply them, they will have expired by then and others are free to take your inventions and build on that. So, if quantum computing requires another three decades to start making money, most of the companies that are currently being invested in will have failed and their patent portfolios and investment will be worthless. Their patents will have expired, their founding scientists will have moved on or retired, etc. At best those companies may be in a position to file more patents. So, any investors investing right now are making bets on how long it will take before there's a meaningful market to get an ROI and which companies are positioned best to take a chunk out of that market. The further that is out, the higher the risk of losing their investment.
There are billions flowing into quantum computing and the article is simply making the point that in terms of revenue potential there seems to be a lot of uncertainty about the practicality of current approaches, the lack of any real revenue (beyond consulting people on how awesome it would be if we had working quantum computing, etc.). And the lack of perspective on when all this will change. Very valid points. There are a few big companies investing in this stuff but none of them is betting their company on it. It's a side show at MS, Google, IBM, etc.
A long shot that might create some viable business decades further from now but if it all fails, their stocks will be fine. There's enough substance there for them to want to have a finger in the pie if it does take off but none of these companies seems to be counting on that happening any time soon.
drewbeck|3 years ago
robertlagrant|3 years ago
mattnewport|3 years ago
pyb|3 years ago
sgt101|3 years ago
The physics underpinning QC was arguably proven in the 2020s. It's not quite done (in the way that fusion was not quite done in the '50s), but there is a fairly clear set of demonstrations that QCs with error correction are possible. However, the engineering barriers are fierce, and there is still a possibility that they are insurmountable. In addition, there are concerns that, while QC will work, the class of problems that are both in NP and in BQP may be very small. Even for a problem in that group, it may be that the algorithms we have are not superquadratic, or only quadratic, meaning that the improvement they offer over classical algorithms may be marginal.
Worse, there are often very good heuristic approaches to some of these problems, which means that although a superquadratic QC approach would be an amazing breakthrough in computer science (genuinely amazing, worthy of accolades and prizes, and fundamentally important for our understanding of the universe) it might offer only marginal economic value. Now, this is not true of some problems where there are exponential explosions and no good heuristics, but there is an even worse catch: quantum algorithms offer computer scientists fresh insight into what's tripping up the classical approaches. In this scenario an amazing breakthrough happens in QC, and someone uses it to get an insight that pushes the classical approach close enough to the QC approach to render the QC approach marginal.
The theoretical picture is moving very fast though - so we will have to see.
On the other hand, the practical side is moving more slowly. We see announcements that suggest a Moore's-law type of scaling is happening, but hidden in the small print there are often (always, as far as I can decode) catches which mean that, while the results look great, they are still very much mired in problems. For example: are all the qubits on a QC usable at once? Can they be used to form an actual algorithm? How long does the machine run for? How long does it take to start? Some of the answers are jarring. Often only a small subset of a machine can be used in an actual problem-solving episode; sometimes the machines run for only a few steps; sometimes the machines take 24 hours or longer to start.
It has taken 70 years to nearly build fusion reactors, it took 70 years to create mRNA vaccines. It may well take 70 years (from now) to build practical, valuable quantum computers. And something could go wrong on that path that just renders them moot.
throw149102|3 years ago
Furthermore, the paper itself links to a GitHub repository [2] with a list of papers that either imply or use an exponential advantage in quantum chemistry. Now would be a good time to mention that I am not an expert in chemistry, nor have I read this entire list of papers, so I am not in a position to go through each one and decide how generic its results are or what its limitations are. Perhaps all these papers have fundamental limitations that prevent their results from being useful in normal chemistry, only in weird souped-up problems specifically devised for a quantum advantage.
Either way, this paper is by no means conclusive on the subject. There's a ton more research to be done in multiple fields to know for sure.
[1] https://arxiv.org/pdf/2208.02199.pdf [2] https://github.com/seunghoonlee89/Refs_EQA_GSQC
lumost|3 years ago
I can provision 1k CPU based servers or ~20 4x GPU based servers in a cloud computing environment for an hour for <$400. These are mature technologies with massive economies of scale behind them. A quantum computer needs to not only outperform scale out GPU/CPU performance on a particular problem set, it needs to crush it.
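A back-of-envelope sketch of that bar. Every constant below is an assumption chosen for illustration only: the classical ops-per-dollar figure is loosely inspired by the cloud pricing above, and the logical-gate rate for a hypothetical error-corrected QC is a pure guess, not a measured number:

```python
import math

# Assumed constants (illustrative, not measured):
CLASSICAL_OPS_PER_DOLLAR = 1e15   # brute-force ops purchasable per dollar in the cloud
QC_LOGICAL_GATE_RATE = 1e4        # hypothetical error-corrected logical ops/second

def classical_cost_dollars(n):
    """Dollar cost to brute-force n candidates classically."""
    return n / CLASSICAL_OPS_PER_DOLLAR

def grover_hours(n):
    """Wall-clock hours for ~(pi/4)*sqrt(n) Grover iterations at the assumed rate."""
    return (math.pi / 4) * math.sqrt(n) / QC_LOGICAL_GATE_RATE / 3600

n = 10**18
print(f"classical: ${classical_cost_dollars(n):,.0f}, Grover: {grover_hours(n):.1f} hours")
```

Under these made-up numbers, a 10^18-item search costs about $1,000 classically, while the quadratically faster quantum search still takes roughly a day of wall-clock time, which is the "needs to crush it, not just beat it" point in concrete form.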
ak_111|3 years ago
In other words, the funding QC is getting is nowhere close to that of the other hype bubbles, and some significant peer-reviewed results have been generated from it, so for the time being you can still give it the benefit of the doubt.
For example it has definitely enhanced our understanding of quantum chemistry and computational complexity, and anyone who invests time learning QC will end up having solid new insight about how the world works and deep engineering knowledge of electronics, which you can't say about many other bubbles.
For example, compare how many QC startups YC has funded (I think 0?) to how many blockchain, crypto, AI-assisted-medicine, and web3 startups it has funded. There is no comparison. QC is far down my list if you want to have a go at hype bubbles.
trhway|3 years ago
>"The little revenue they generate mostly comes from consulting missions aimed at teaching other companies about 'how quantum computers will help their business,'"
well, that makes QC bona fide a tech industry.
pavlov|3 years ago
WhitneyLand|3 years ago
In the case of the Internet Bubble stocks were down 78%, but it was not hard to do well in the end given diversification and a long enough horizon.
In the case of the Dutch Tulip bubble there was no good ending for anyone except those who got out early.
Some bubbles like NFTs generate strong opinions but have yet to have final judgment from history.
I think the quantum computing bubble is different than all three, but closer to the Internet than to Tulips. In which case the conventional strategy would be to diversify and expect a long time horizon.
wikfwikf|3 years ago
This is untrue. If you were invested in what became the stars of that era: Amazon, Red Hat, Cisco, a few others, you eventually made decent money, although far worse than if you had stayed out and bought the dip.
If you had a diversified portfolio of 'new economy' stocks which didn't include a few winners like this, you might have lost over 95% of your money and never got it back. Lots and lots of stocks simply disappeared or were bought for peanuts. Many others, including lots of very very highly rated ones like Yahoo never exceeded their bubble-era peaks.
alpineidyll3|3 years ago
And that's it! The author of this article is 100% right. Markets are fully aware, though: go ahead and try to short any publicly traded QC stock, lol. You can't. There are no shares to borrow and no liquidity on puts.
samlavery|3 years ago
> The reality is that none of these companies — or any other quantum computing firm, for that matter — are actually earning any real money.
ORLY? I guess I should go massively short IBM shares then. https://newsroom.ibm.com/image/2022%20IBM%20Quantum%20Roadma...
> Shor’s algorithm has been a godsend to the quantum industry, leading to untold amounts of funding from government security agencies all over the world. However, the commonly forgotten caveat here is that there are many alternative cryptographic schemes that are not vulnerable to quantum computers. It would be far from impossible to simply replace these vulnerable schemes with so-called “quantum-secure” ones.
ORLY? New cryptography can take 20 years or more to be fully deployed to all National Security Systems. NSS equipment is often used for decades after deployment. National security information intelligence value varies depending on classification, sensitivity, and subject, but it can require protection for many decades. -NSA
The solutions we do have do not work very well. Only the weakest, FALCON-512 (a bad name, as it offers only 64 bits of quantum security, and now the dual lattice attack seems to reduce this to 20?), actually fits the TLS use case without breaking the internet. The signatures are just too big. Cloudflare has testing that proves this.
If that wasn't enough, the author is completely unaware of the annual survey of quantum researchers that puts the arrival of a cryptanalytically relevant quantum computer at 2030 or so. Peter Shor is actually one of the people polled in the survey; the author is not. And if your pants are still clean, you can look at the survey's estimates since 2018: they are clearly trending sooner and sooner, not further and further away.
If you still have doubts, read this: https://www.whitehouse.gov/briefing-room/statements-releases...
cs702|3 years ago
Quoting:
> Billions of dollars have poured into the field in recent years, culminating with the public market debuts of prominent quantum computing companies like IonQ, Rigetti and D-Wave ... These three jointly still have a market capitalisation of $3bn, but combined expected sales of about $32mn this year (and about $150mn of net losses), according to Refinitiv.
> The reality is that none of these companies — or any other quantum computing firm, for that matter — are actually earning any real money. The little revenue they generate mostly comes from consulting missions aimed at teaching other companies about "how quantum computers will help their business", as opposed to genuinely harnessing any advantages that quantum computers have over classical computers.
seydor|3 years ago
t_mann|3 years ago
Well, we do know that P <= BQP <= PSPACE, and we have one important example that lies in BQP but (for all we know) not in P. It's just not clear how important that particular example is for the kind of computing we do today, if it ever becomes practical. It looks like it'd rather result in a one-time nuisance for sysadmins, like Y2K was.
The hope was for applications in new areas like materials and drug design. The author has posted a link to one paper suggesting that we might not see exponential speedups in chemical simulations, but that's not an outright refutation either.
goethes_kind|3 years ago
A real performant quantum computer could potentially revolutionize a lot of industries. But selling it as an accelerator of molecular dynamic simulations is not quite as sexy.
snidane|3 years ago
https://scottlocklin.wordpress.com/2019/01/15/quantum-comput...
unknown|3 years ago
[deleted]
mikewarot|3 years ago
Operator Imprecision and Scaling of Shor’s Algorithm
https://arxiv.org/ftp/arxiv/papers/0804/0804.3076.pdf
mamonster|3 years ago
upofadown|3 years ago
mikewarot|3 years ago
I further posit that there are no quantum algorithms without binary equivalents.
[1] https://en.wikipedia.org/wiki/Quantum_supremacy
[2] https://en.wikipedia.org/wiki/Shor%27s_algorithm
bawolff|3 years ago
Because i find it really hard to believe there will ever be an O(sqrt(n)) classical algorithm for unstructured search. How could there possibly be?
tomthumb|3 years ago
bawolff|3 years ago
kvathupo|3 years ago
On utility, there's more than just Shor's: unstructured search [1], finance ([2], [3]). Even if quantum computers ultimately prove unfruitful commercially, that doesn't render it a useless endeavor. Like String Theory, it can beget findings in other areas, regardless of whether you can profit from them: novel classical recommendation algorithms ([4]), quantum algorithms for SAT that could possibly help automated theorem proving ([5]).
Part of the difficulty of quantum computing is that to show speedup, you need to find complexity bounds on classical problems whose runtime is actively being researched, e.g. neural networks ([6]).
As for their financial worthwhileness, while there is valid concern ([7], [8]), it's far too early to tell: it's hardware, not software. Also, it's my understanding that private investment is much larger than public funding in the US for quantum computing, both of which pale in comparison to China's investment. Thus, I wouldn't want to see investors shy away if the government is unwilling to make up the difference!
[1] - https://en.wikipedia.org/wiki/Grover%27s_algorithm
[2] - https://arxiv.org/abs/1905.02666
[3] - https://arxiv.org/abs/1908.08040
[4] - https://scottaaronson.blog/?p=3880
[5] - https://cstheory.stackexchange.com/questions/36428/do-any-qu...
[6] - https://arxiv.org/abs/1912.01198
[7] - https://www.microsoft.com/en-us/research/project/topological...
[8] - https://arxiv.org/abs/2110.03137
rpz|3 years ago
cyberlurker|3 years ago
voxleone|3 years ago
timbit42|3 years ago
less_less|3 years ago
> The most prominent application by far is the Shor algorithm for factorising large numbers into their constituent primes, which is exponentially faster than any known corresponding scheme running on a classical computer. Since most cryptography currently used to protect our internet traffic are based on the assumed hardness of the prime factorisation problem, the sudden appearance of an actually functional quantum computer capable of running Shor’s algorithm would indeed pose a major security risk.
> Shor’s algorithm has been a godsend to the quantum industry, leading to untold amounts of funding from government security agencies all over the world. However, the commonly forgotten caveat here is that there are many alternative cryptographic schemes that are not vulnerable to quantum computers. It would be far from impossible to simply replace these vulnerable schemes with so-called “quantum-secure” ones.
Note that Shor's algorithm breaks not just factoring, but also discrete log, including elliptic curve discrete log. That includes classic DH and DSA of course, as well as ECDSA and ECDH, whether they're over Bitcoin's curve, the other NIST curves, Brainpool, {curve,ed}{25519,448}, pairing-friendly curves, everything. Almost all broadly deployed public-key crypto uses RSA or elliptic curves. Those alternative public-key algorithms are still being worked out, and will take years to broadly deploy, so if a QC gets built, it will probably be able to break into straggling systems for some years. There is also a risk that the replacements will eventually fall to quantum or even classical attack, especially considering that a significant fraction of the proposed replacements already have fallen (most recently SIKE) or been weakened (eg, every multivariate quadratic sig). They may also have other security problems, eg implementation bugs or side-channel attacks.
The surviving quantum-secure algorithms are all either pretty inefficient (McEliece and SPHINCS+, and CSIDH and SQISign, but those are also bleeding-edge), or use structured lattices (Kyber, Falcon, Dilithium, NTRU and NTRU Prime, etc) or structured codes that look kind of like structured lattices (BIKE, HQC). So we'll have most of our eggs in just a couple of baskets again, and outside of applications that can use McEliece and SPHINCS+, they'll be newer, less-tested baskets. Also, while fast, the structured lattice and structured code systems still use significantly more bandwidth than elliptic curves.
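To put rough numbers on the bandwidth point: here are the published parameter sizes (in bytes) for two of the lattice schemes mentioned, next to their elliptic-curve counterparts. A quick sketch; the figures are the standard NIST-submission sizes as I recall them, so treat them as ballpark:

```python
# Wire sizes in bytes: elliptic-curve schemes vs. lattice-based
# replacements (security level ~128 bits in each row).
kex = {
    "X25519 (elliptic curve)": {"public_key": 32,   "ciphertext": 32},
    "Kyber-768 (lattice KEM)": {"public_key": 1184, "ciphertext": 1088},
}
sig = {
    "Ed25519 (elliptic curve)": {"public_key": 32,   "signature": 64},
    "Dilithium2 (lattice)":     {"public_key": 1312, "signature": 2420},
}
for name, sizes in {**kex, **sig}.items():
    print(f"{name:26s} {sizes}")
```

So a Kyber handshake moves roughly 30-70x more key-exchange bytes than X25519, and a Dilithium signature is about 35x an Ed25519 one; fast, but noticeably heavier on the wire.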
Using long-term symmetric keys instead of or in addition to public-key crypto is possible in some applications, but it's obnoxious and limiting: you'd end up with some combination of Kerberos derivatives (with trusted third parties acting as single points of security failure), mailed smartcards or other secrets, and physical in-person meetings to set up shared keys.
So the bigger issue in my view is that outside of Bitcoin, breaking crypto is mostly a net negative for society. Transitioning to quantum-secure crypto is also a negative, in that it will take a ton of work and the replacements are less efficient than elliptic curves, and may have security problems. (It's also probably unavoidable because governments will try to build QCs to break crypto even if private industry doesn't.) So all this money is being spent on something whose first major application will be negative, if it even works at all. Hopefully the positive stuff will outweigh this.
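For anyone wondering why Shor's algorithm breaks factoring at all: the quantum part only does order finding, and the rest is classical number theory. A toy sketch (my own illustration; it brute-forces the order, which is exactly the step a quantum computer would do in polynomial time):

```python
from math import gcd

def factor_via_order(N, a):
    """Find the multiplicative order r of a mod N, then recover
    factors of N from gcd(a^(r/2) +/- 1, N). Shor's algorithm
    only accelerates the order-finding step."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g  # lucky guess: a shares a factor with N
    # Brute-force order finding (the exponentially slow classical part).
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2 == 1:
        return None  # odd order: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None  # trivial square root: retry with another a
    return gcd(y - 1, N), gcd(y + 1, N)

print(factor_via_order(15, 7))  # order of 7 mod 15 is 4 -> (3, 5)
```

The same reduction works for discrete log, which is why everything built on RSA or elliptic curves falls together.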
wzwy|3 years ago
To write an article for the site, we would need to:
1. Write a headline with no mentions of any experts.
2. Write another headline mentioning at least one expert in it.
3. Write the content without mentioning any experts.
4. Write the content and sprinkle names of experts as needed.
5. Publish.
Now, the reader would then:
1. Be exposed to the no-experts version of the article - both headline and content.
2. Once finished, be prompted to write their thoughts on the article.
3. Click “Reveal”.
4. The reader would then skim or read the whole article again, but this time it would mention the experts.
5. Be prompted to evaluate how their thoughts changed after reading the expert version of the article.
I’m so gullible that mentions of experts in anything, especially when names of prestigious institutions or titles are tacked on, tend to shut down the reasoning part of my brain altogether.
Bear in mind, the site I proposed is not a place to police how articles should be written; rather, it’s all about increasing its readers’ awareness of how much mentions of an authority can impact their initial reasoning and judgement, and sometimes make them stop reasoning at all. My view is that mentions of an authority are useful for calibrating our judgements after we have tried to reason on our own, but not before.
And yeah, I have no opinion on the original post. Just like to go off on a tangent once in a while.
bluecalm|3 years ago
The same goes for health advice, but there it's 99.9%+ - if you're not in the field, you can only listen and hope you're good at estimating who is more credible, more likely to have done better research, or making more truthful claims. Trying to evaluate the claims yourself is a recipe for being wrong and creating your own bubble.
If the article says: random guy X says quantum computing is a scam because Y, there is nothing I can take away from it, because it's very easy to make Y both incorrect and plausible-sounding to me. If I know it's an Oxford physics professor who makes the claim, I can at least learn that Y is a serious enough reason not to be easily dismissed.
Appeal to authority is bad as an argument when people knowledgeable in the field try to debate a certain point. In other cases it's very useful to know who makes the claims and very often it's the only thing that gives the claims credibility.
brutusborn|3 years ago
I've always wondered if the format of long form expert opinions could be replaced by a knowledge graph that is independent of the expert.
E.g. instead of article "Economist John rejects minimum wage"
Root node "Minimum wage is not the best solution to problem x" -> because -> <node to define problem>, <node to define alternative solutions> -> because -> <leaf nodes of studies or models>
In this way other experts could add to the graph and the differences between different branches of argument could be more easily compared or automatically updated. Articles could still be written, but could reference specific nodes or edges of the graph which adds clarity to the discussion.
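A minimal sketch of what such an argument graph might look like in code (the node names are invented for illustration; "because" edges link each claim to its support, so rival branches can be compared side by side):

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One claim in the argument graph; 'because' edges point at support."""
    claim: str
    because: list["Node"] = field(default_factory=list)

# Hypothetical example mirroring the minimum-wage article above.
study = Node("Study: employment effects of a minimum-wage change in state X")
alternative = Node("Alternative solution: wage subsidies", because=[study])
root = Node("Minimum wage is not the best solution to problem X",
             because=[alternative])

def show(node, depth=0):
    """Print the argument tree, indenting each supporting claim."""
    print("  " * depth + node.claim)
    for child in node.because:
        show(child, depth + 1)

show(root)
```

Another expert disagreeing would just attach a new branch under `root` rather than writing a competing article, which is the comparability the parent comment is after.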
fallingfrog|3 years ago
samwillis|3 years ago
arinlen|3 years ago
Basing the perceived quality of an article on an appeal to authority doesn't make much sense either.
The Royal Society's motto is literally "take nobody's word for it."
unknown|3 years ago
[deleted]
mandmandam|3 years ago
Someone hasn't been watching the cryptocurrency markets.
That's partly tongue in cheek. But there are countless examples of the market remaining irrational longer than one can stay solvent.
Witness the continued success of BTC and Ether, amid newer options that outperform them on every tech-related metric, often by many orders of magnitude. I conclude that marketing hype and the first mover advantage form the vast bulk of valuation in a novel tech that people don't understand.
This is not to take away from the author's point at all - I would hope that anyone who invests in quantum computing reads the criticism from an insider who can actually read the papers.
However, as irrational and harmful as it is, I don't expect BTC to drop to zero before the day quantum computing actually does follow through. Rationality really isn't our thing.
jongjong|3 years ago
Well said. Having worked in the blockchain space as a developer and founder since 2017, I've come to the same conclusion. The formula for success is hype + first-mover advantage. Aside from that, it's all about social climbing and politics around those projects.
It's surprising how long the first-mover advantage lasts, and it's weird to see that even developers who should know better are getting pulled into learning poorly designed (or outdated) technologies. They're conflating the financial achievements of projects with their technological achievements.
I guess that's what happens when big investors are laser-focused on making as much money as possible instead of also trying to drive innovation forward.
IMO, the inability to separate the two is a major reason why we have such significant financial bubbles in the tech sector.
bluecalm|3 years ago
It's really not a surprise it's not about tech when the goals and challenges were never technical.
On the other hand, in areas where it is about tech, we see the superior product winning over established players all the time: Google, WhatsApp and AMD, to name three out of many examples from various points in this millennium.
unknown|3 years ago
[deleted]
Capira|3 years ago
What outperforms Bitcoin in terms of decentralization, scalability, and resilience?
dang|3 years ago
amai|3 years ago
quantum42|3 years ago
I say this because:
1. He says nothing about the breakthroughs in quantum error correction that are allowing IBM to promise a leap from 89 qubits today to 4,000 qubits in 2025 (still not enough on its own for a cryptographically relevant quantum computer - CRQC - running Shor's algorithm for exponential speedup in breaking e.g. RSA-2048, which some research suggests would take 20M qubits, including those needed for error correction).
2. He did not mention Grover's algorithm, which provides a quadratic speedup (in the time complexity of searching an unsorted list of N items for a particular string) over its classical counterpart. Even a quadratic speedup is considerable when N is large.
3. He did not mention the breakthrough by University of Chicago researchers showing that multiple quantum computers can be entangled over tuned optical fibers to act as a single quantum computer. This doesn't mean we can go from 4,000 qubits to 20M by networking 5,000 of the machines IBM has promised for 2025, but it does provide a trajectory for networked quantum computing as a horizontal scaling strategy.
4. He did not mention the $100B allocated this year by the White House/Congress for CRQC research.
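On point 2, the quadratic speedup is easy to make concrete: an unstructured search over N items needs ~N queries classically in the worst case, while Grover's algorithm needs about (pi/4)*sqrt(N) oracle calls. A quick back-of-the-envelope sketch:

```python
from math import pi, sqrt, ceil

# Worst-case query counts: classical linear search vs. Grover's
# optimal ~(pi/4)*sqrt(N) oracle calls for unstructured search.
for N in (10**6, 10**9, 10**12):
    grover = ceil(pi / 4 * sqrt(N))
    print(f"N = {N:>15,}   classical ~{N:,}   Grover ~{grover:,}")
```

The caveat, of course, is that each quantum "query" runs on error-corrected hardware we don't have yet, so the constant factors currently swamp the asymptotics.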
What is his motive in giving us such an incomplete story with such a skewed conclusion? Is he working for a hedge fund that is shorting some stock? Or is he just a layperson trying to sound intelligent by writing about a field where he's not sufficiently informed?
quantum42|3 years ago
[deleted]