It is good that they kept the classical crypto alongside the new scheme. However, the general tendency towards quantum-resistant cryptography leaves me puzzled. From my perspective as a physics PhD graduate, I firmly believe that a quantum computer capable of breaking public key crypto will never be built. This is because as you add more qubits, there's increased interference between them due to the additional connections required.
It's similar to how FM radio works: there's a main frequency and several sidebands. When you adjust the tuner to pick up a station, you're essentially "interacting" with the corresponding station. But if there are too many stations, you may no longer be able to hear the music; all that remains is static noise.
This leads me to a somewhat cynical conspiracy theory. Imagine the moment when a curious government agency realises that building a quantum computer for this purpose is a futile endeavor. Instead of admitting this, they could perpetuate the idea that its construction is just around the corner. Then, acting as a wolf in sheep's clothing, they could introduce everyone to quantum-resistant solutions that, thanks to the agency's more advanced research on them, unfortunately contain hidden backdoors. Has anyone thought about this?
Doesn't your argument apply to classical bits too? The more interconnected a classical bit is, the more parasitic coupling it will experience. That was an argument against the feasibility of classical computers in the 1940s (until von Neumann published work on fault-tolerant classical computing).
Both classical and quantum computers (1) cannot "scale" without error correction because of analog noise (although it is less crucial on the classical side), but (2) can be built with error correction codes integrated in them to overcome that noise.
Also, you do not need all-to-all connectivity between your qubits (or bits) to build a scalable quantum (or classical) computer.
Edit: To add to your FM radio metaphor: you can have way more FM radio channels if each channel is on a separate coax cable (or in physics speak, if you multiplex not only by frequency but by spatial mode). No need to have all your qubits controlled by the same optical or microwave mode; you can have physically separate control lines for each qubit, and then eliminating cross-talk is as simple as inverting an n-by-n matrix.
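A minimal numerical sketch of that last point, assuming a simple linear crosstalk model in which the drive each qubit actually sees is a matrix C applied to the intended drives; the matrix and amplitudes below are made up for illustration:

    import numpy as np

    # Toy linear crosstalk model: qubit i actually receives sum_j C[i, j] * drive[j].
    # Pre-distorting the drives with C^-1 makes the delivered drives equal the
    # intended ones - the "invert an n-by-n matrix" step.
    C = np.array([[1.00, 0.05, 0.00],
                  [0.05, 1.00, 0.05],
                  [0.00, 0.05, 1.00]])      # small nearest-neighbour crosstalk

    intended = np.array([1.0, 0.0, 0.5])    # amplitudes we want each qubit to see
    precompensated = np.linalg.solve(C, intended)
    delivered = C @ precompensated

    print(np.allclose(delivered, intended))  # True: crosstalk cancelled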
CRYSTALS-Kyber was designed by an academic team, not a government (though a government standards body refereed the competition that selected it; that competition, in turn, was driven by technical feedback that came predominantly from academic cryptography teams around the world). In general, with some exceptions (like curve isogenies), the "math" behind PQ crypto is pretty well established; it was just less attractive than curves were for performance reasons, but is now more attractive if you presume curves will be broken in 20 years by QC.
People have already said here most of what I want to say in this comment, but just to make it as explicit as possible:
Essentially the only reason anyone thinks that useful quantum computation is possible is because of things called threshold theorems, which state that as long as the noise in each qubit is less than some small but non-zero error rate, you can add more qubits and use quantum error correction to make your computation arbitrarily precise. In other words, as long as you're below the threshold rate, quantum computers scale well.
Of course those threshold rates are very very small, and creating significant numbers of qubits which are below the threshold rates is incredibly difficult, but theoretically it works.
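To make the scaling claim concrete, here is a toy calculation using the usual rule of thumb that a distance-d code suppresses errors like (p/p_th)^((d+1)/2); the threshold and prefactor below are illustrative, not the numbers for any real code:

    # Rule-of-thumb logical error scaling for a distance-d error-correcting code:
    #   p_logical ~ A * (p_physical / p_threshold) ** ((d + 1) // 2)
    # Below threshold, adding qubits (larger d) suppresses errors exponentially;
    # above threshold, the same formula grows instead.
    p_threshold = 1e-2
    A = 0.1

    def logical_error(p_physical, d):
        return A * (p_physical / p_threshold) ** ((d + 1) // 2)

    for d in (3, 7, 11, 15):
        print(d, logical_error(1e-3, d))
    # prints roughly 1e-3, 1e-5, 1e-7, 1e-9: more qubits, lower logical error, as long as p < p_th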
I've heard physicists raise opinions like yours (i.e. QC will never be built for practical reasons), but I also hear ones that say the opposite. I'd err on the side of caution.
As for your conspiracy: The conclusion of that would be to continue using hybrid constructions.
Though (and I know crypto better than physics) I'd consider it highly unlikely. Creating backdoors that others won't find is next to impossible. Why do I say that? Because we have some historical evidence of how "NSA wants to plant a backdoor into an algorithm" plays out. Usually such backdoors have been discovered quickly. They can still be successful: as we saw with Dual_EC_DRBG, it was discovered quickly, yet nobody really cared and industry still used the algorithm.
But something like that won't happen in a transparent process. You can be sure that every expert in the field has had a look at CRYSTALS-Kyber. If anything about it were suspicious, we would know.
> I firmly believe that a quantum computer capable of breaking public key crypto will never be built. This is because as you add more qubits, there's increased interference between them due to the additional connections required.
Seems weird to assume what's possible based on current technical obstacles. If you trace the development of CPUs, or many other technologies, many people with deep technical knowledge were certain about thresholds which we have long since passed. This bias even has a name, which I forget.
To a first approximation, the US government uses the same cryptography that US consumers do -- AES, SHA-2, the NIST P curves, ECDSA, etc. are all categorized for various levels of data integrity and confidentiality within the government.
The same will be true of PQ signing schemes, meaning that a backdoor would be predicated on the USG believing that they have some truly remarkable NOBUS breakthrough in PQC. That feels unlikely to me; NSA interventions in cryptographic design have historically gone in the opposite direction[1].
(This is separate from the actual design question. I don't know as much about stateless PQ signing schemes, but the stateful ones are mostly "boring" hash-based cryptography that's well understood to be strong in both settings.)
[1]: https://en.wikipedia.org/wiki/Data_Encryption_Standard#NSA's...
The main twist is that we don't know the future, but we do know how theoretical QCs would be able to break currently used cryptography, with quantum algorithms like Shor's algorithm: https://en.wikipedia.org/wiki/Shor%27s_algorithm
"As such, as quantum technology advances, there is the potential for future quantum computers to have a significant impact on current cryptographic systems. Forecasting the future is difficult, but the general consensus is that such computers might arrive some time in the 2030s, or might not arrive until 2050 or later." quoted from:
https://datatracker.ietf.org/doc/draft-ietf-pquip-pqc-engine...
The good news is that if you're willing to take a 2x slowdown for asymmetric encryption (which basically everyone is) you can get the best of both worlds by wrapping your quantum resistant crypto in regular crypto.
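A minimal sketch of that wrapping idea (the general shape only, not Signal's exact PQXDH construction): derive the session key from both a classical X25519 shared secret and a post-quantum KEM shared secret, so an attacker has to break both. This assumes the Python "cryptography" package for X25519 and uses a placeholder where the Kyber/ML-KEM output would go, since no particular KEM library is assumed here:

    import hashlib
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

    # Classical part: X25519 Diffie-Hellman between two parties.
    a, b = X25519PrivateKey.generate(), X25519PrivateKey.generate()
    dh_secret = a.exchange(b.public_key())

    # Post-quantum part: stand-in for a Kyber/ML-KEM encapsulation result.
    kem_secret = b"\x00" * 32  # placeholder; in practice this comes from the KEM

    # Combine both: an attacker has to recover both inputs to get the session key.
    # (A real protocol would use a proper KDF with domain separation, not bare SHA-256.)
    session_key = hashlib.sha256(dh_secret + kem_secret).digest()
    print(session_key.hex())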
So you are claiming the protocol that Signal has adopted is already backdoored by the government. Extraordinary claims require extraordinary evidence. You need to provide some kind of evidence of this. We are talking 20+ years of open and public research on post-quantum cryptography.
If I were the NSA, that's exactly the kind of FUD (fear, uncertainty, and doubt) I'd try to propagate.
While I appreciate and respect the useful criticism, I would be careful with this kind of content:
Regardless of whether you read this somewhere or authored it: if you're right, then go ahead and show the world how we're wrong; if you're wrong, you're helping quantum attackers.
FM radio has the capture effect; interference there is not additive as in AM. You're very likely to capture the strongest station unless there's a lot of interference.
As for QC, we don't seem to be running into any limits yet. It may well be that any limits are well above a useful number of qubits.
I assume you’re not proposing some kind of interference limit in principle?
Are you suggesting that limiting interference will be a practical dead end that prevents advancement?
Either way that would be a pretty significant claim. There are lots of research directions being pursued and plenty of smart people think it’s worth trying.
Can you explain how qubits are physically implemented in a real-world computer? I just cannot wrap my mind around what they're made of and how they operate in physical reality.
Given that Signal's main innovation (compared to traditional end-to-end encryption) was to safeguard its users against future compromises via the ratchet protocol, this actually seems like a logical move for them to make.
I think it will be pretty obvious when someone gets a quantum computer working.
I don't think these incentives make sense at all. Government organizations suspected to be developing quantum computers probably have larger annual budgets than 20 billion. The ability to undermine virtually all cryptographic systems is unquantifiably large.
Once the cat is out of the bag, everyone will rush to post-quantum cryptography and all that value will be lost in a relatively short period. Indeed, we already witnessed this in the 2010s following the Snowden revelations when big tech, in a concerted effort, adopted HTTPS. Now that is the standard.
For example, "The Fiscal Year 2022 budget appropriation included $65.7 billion for the National Intelligence Program, and $24.1 billion for the Military Intelligence Program."
Source: https://irp.fas.org/budget/index.html
It's worth way more than $20B USD to have a working quantum computer that nobody knows about. You don't burn a weapon like that by inducing everyone to update immediately.
This is a myopic take: the attacker could not spend the bitcoins because of the public ledger, and the value of Bitcoin would drop to nothing once it is realized that wallets are not secure. They'd burn the bitcoin for no gain, for a loss even, because they would reveal their capabilities and maybe even who they are.
Bitcoin is already quantum attack resistant, unless you use un-hashed public keys or reuse Bitcoin addresses (as some do).
If Bitcoin became vulnerable, its value would collapse to zero overnight once that was known. There is a limited amount of money anyone could extract before the value collapses.
My understanding is that bitcoin addresses are quantum safe as long as you do not reuse an address after spending funds sent to it [0]. Per the linked article, this is standard practice, so I would assume the majority of addresses are actually quantum safe.
[0] https://www2.deloitte.com/nl/nl/pages/innovatie/artikelen/qu...
And for more context: with p2pkh addresses, you are sending to the hash of the public key, and hashes are quantum safe.
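A simplified sketch of why paying to a hash helps: the chain only ever sees H(pubkey) while funds sit at the address, and the public key itself is revealed only by the spending transaction. Real P2PKH uses SHA-256 followed by RIPEMD-160 plus a version byte and checksum; that is collapsed into a single SHA-256 here, and the "public key" is just random bytes for illustration:

    import hashlib, os

    pubkey = os.urandom(33)                       # stand-in for a compressed EC public key
    address = hashlib.sha256(pubkey).hexdigest()  # what is published while funds sit there
    print(address)

    # Only the spending transaction reveals pubkey; a quantum attacker would have
    # to derive the private key in the short window before that spend confirms.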
Why not use something like backchannel? That way we wouldn't need phone numbers either...
The initial shared private key exchange could be done with more expensive, quantum resistant cryptography but the actual communication could be done through symmetric encryption.
https://www.inkandswitch.com/backchannel/
For the key exchange itself ("PAKE") maybe something like this: https://journal-home.s3.ap-northeast-2.amazonaws.com/site/ic...
And for the symmetric encryption: https://github.com/Steppenwolfe65/eAES
> The initial shared private key exchange could be done with more expensive, quantum resistant cryptography but the actual communication could be done through symmetric encryption.
That's exactly how Signal's symmetric ratcheting works (with the addition of an asymmetric ratcheting step to limit the forward impact of a key compromise on either side, which you can't do symmetrically).
> For the key exchange itself ("PAKE")
A PAKE requires a short shared secret. That does not usually exist in Signal's scenario.
> And for the symmetric encryption: [...]
Why that over regular AES?
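To make the ratcheting point above concrete, here is a minimal sketch of a hash-based symmetric ratchet; it mirrors the idea only, not Signal's actual Double Ratchet code, and the labels and starting key are illustrative:

    import hmac, hashlib

    def kdf(chain_key: bytes, label: bytes) -> bytes:
        return hmac.new(chain_key, label, hashlib.sha256).digest()

    chain_key = b"\x01" * 32   # would come from the (PQ)XDH handshake in practice
    for i in range(3):
        message_key = kdf(chain_key, b"message")  # encrypt message i with this key
        chain_key = kdf(chain_key, b"chain")      # one-way step: old keys can't be recomputed
        print(i, message_key.hex()[:16])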
That is very well-written, as someone else pointed out, though this common explanation for laypeople needs work (I'm not blaming Signal's blogger, who wrote it more carefully than most):
"Instead of bits as in a classical computer, quantum computers operate on qubits. Rather than 0 or 1, qubits can exist in a superposition of states, in some sense allowing them to be both values at once."
'Instead of beads as in a classical abacus, our Quabacus operates on Quabeads! Rather than positions 0 or 1, quabeads can be in both positions at once!'
Beads that are simultaneously in both positions sounds like a f$@!#g annoying glitch and not a feature - how does that help anyone record or calculate numbers? ('Would someone take a look at this broken Quabacus-abacus and resolve these g#$%!m glitching quabeads?!!!') It mocks the non-technical reader, who assumes they must have been given enough information to understand why it's faster and possibly how it works, but can't figure it out.
They have not been given enough. Does anyone who perhaps understands it better than I do want to take a stab at a new commonplace explanation, one that connects the dots between quantum superposition and performance (for certain calculations)?
You are correct. That is a horrible explanation. And incomplete.
* The state of a qubit has two basis states. Arbitrary states are linear combinations (=superposition) of these basis states. Importantly, the coefficients in the linear combination can be positive or negative.
* The state of N qubits has 2^N basis states. Poly(N) gates can put the N qubits into a state in which each of the 2^N basis states has a non-zero coefficient.
* In quantum algorithms, we play with the coefficients. Basically, certain problems (like factoring) have structure such that you can change the coefficients so that the ones corresponding to the correct answer have magnitude approximately 1, while other coefficients have magnitude 0. This is enabled by the fact that coefficients are positive and negative, and so by mixing them, you can cancel some out.
* When you measure the qubits, the probability of each possible answer is proportional to its coefficient squared. Hence, with high probability you get the right answer.
Where does the speed up come from? When you put the state of N qubits in the linear combination of 2^N basis states, and run it through the problem, that is akin to computing all answers in parallel. Like for factoring, checking all possible divisors to see which ones work. But to extract the answer out in the way described above, you have to do additional processing. This only works for some problems, with the right structure.
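A toy numerical version of the points above (illustrative only): two qubits are a length-4 vector of signed amplitudes, gates are unitary matrices, and measurement probabilities are the squared magnitudes; "interference" is signed amplitudes cancelling.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate on one qubit
    HH = np.kron(H, H)                              # apply it to both qubits

    state = np.zeros(4)
    state[0] = 1.0                                  # start in basis state |00>
    state = HH @ state
    print(state)                                    # [0.5 0.5 0.5 0.5]: all 4 outcomes equally likely

    # Applying the same gates again makes the signed amplitudes cancel everywhere
    # except the original state: that cancellation is interference.
    state = HH @ state
    print(np.round(state ** 2, 3))                  # [1. 0. 0. 0.]: you always measure |00>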
There isn't really a great way of explaining quantum behavior using everyday (classical) terms. Any analogy you come up with will be deeply flawed and still unsatisfyingly opaque to the reader.
The only way is to gear up on math to the level where you can, if not reason within the theory, then at least make sense of its presented conclusions.
You're frustrated because it's a bullshit answer based on a lack of understanding of what a superposition of states represents and what quantum computing is calculating. Qubit superpositions represent probability distributions between 1 and 0. And quantum computers calculate probable ranges of correct results for certain types of problems. They can produce these ranges of probable correct results using much less memory and time than it would take a classical computer to calculate similar results. We're talking an exponential improvement from each additional qubit compared to each additional classical bit. For example, in 2019, Google used a 53-qubit quantum computer to complete a task in 200 seconds that Google claimed would take 10,000 years to finish with the best known classical computer at the time. (Note that IBM claims it would only take 2.5 days on their classical computer, but you can see the scale of improvement is huge either way.)
The shortest possible answer is that qubit states are modeled as two-dimensional vectors on the complex unit sphere. We arbitrarily designate two orthonormal vectors on this sphere as corresponding to classical states 0 and 1. If the qubit vector isn't in the 0 or 1 state, it's in some linear combination of them. This is called superposition. Since most people don't know what linear combination means, superposition is explained as "sort of both at the same time". Upon measurement the qubits are collapsed to 0 or 1 with some probability proportional to how close they are to the 0 and 1 states. The precise probabilities are given by something called the Born rule. I gave a longer talk aimed at computer scientists if you're interested beyond this explanation: https://youtu.be/F_Riqjdh2oM
I don't really understand your objection; that description seems about as good as you can do when trying to summarize quantum mechanics in one sentence.
If current quantum computers were scaled up to more qubits, could they break modern crypto? Or would we need both more qubits and a new quantum computer architecture?
I wonder how possible it is that IBM could have already gone further and are already cracking modern crypto in secret, e.g. funded by the NSA. Is that a crazy conspiracy idea, or actually a possibility?
Actively resisting future attackers and hardware is an incredibly forward-thinking thing to do, bravo. How long into the future is an achievable and desirable duration for encryption (barring any rapid, unforeseen paradigm shift)? If ten years is acceptable for declassification of standard documents in the US, is this a reasonable target for day to day signal chats?
Maybe we need a statute of limitations for encrypted data to help with future-proofing and make the collection useless in a court of law? If you go to lengths to encrypt your data, there should be some current and future expectation of privacy around it, even if someone can decrypt it.
> How long into the future is an achievable and desirable duration for encryption (barring any rapid, unforeseen paradigm shift)?
I don't think "years of expected security" (as used to be popular for e.g. RSA key lengths for some time) is a meaningful metric anymore:
AES-256 and elliptic curve encryption are resistant against classical attackers until beyond the heat death of the universe, so their "time of security" is, for practical purposes, infinite.
I'd expect that, for quantum-safe asymmetric algorithms as well as for AES, there is a similar number corresponding to fundamental physical infeasibility, and then we can also just pick that rather than any low or high number of years.
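A back-of-the-envelope version of the "heat death" figure above, assuming an absurdly generous classical attacker doing 10^18 key guesses per second (the guess rate is an arbitrary assumption):

    guesses_per_second = 1e18
    seconds_per_year = 3.15e7

    for bits in (128, 256):
        years = 2 ** bits / guesses_per_second / seconds_per_year
        print(bits, f"{years:.2e} years")
    # ~1.1e13 years at 128 bits, ~3.7e51 years at 256 bits;
    # the universe is roughly 1.4e10 years old.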
At Tutanota we also use the Signal protocol to build post-quantum secure encryption for email and drive: https://tutanota.com/blog/pqdrive-project
Post-quantum secure encryption can't be developed early enough. In the end all our data today will be at risk of being decrypted in the future - unless we secure it now!
That's not exactly the most compelling blog post. Way too many buzzwords. Why would end-to-end encrypted cloud storage even be using public-key cryptography? You're sending the encrypted data to yourself, forward in time.
>PQXDH provides post-quantum forward secrecy and a form of cryptographic deniability but still relies on the hardness of the discrete log problem for mutual authentication in this revision of the protocol.
So that's why active MITM with a contemporary quantum computer is a concern mentioned in the blog post. Of course it isn't of any concern currently (since no one has the hardware to exploit this), but I'm curious why they couldn't fit the CRYSTALS-Kyber method for mutual auth into this hybridized implementation? Performance concerns?
That's likely simply because they don't want to switch fingerprint formats again just yet. (They are currently in the process of upgrading the format for a non-cryptographic reason [1].)
Signal fingerprints, which users can manually verify in person or over a trusted channel, are just hashes over the public keys of both users involved – and if these keys change (e.g. due to a quantum upgrade), the format would need to change as well.
Update: Seems like that's actually due to a fundamental restriction of the quantum-safe primitives used and is addressed in the technical specification [2]:
> The post-quantum KEM and signature schemes being standardized by NIST [...] do not provide a mechanism for post-quantum deniable mutual authentication [...]
Seems like Signal's neat trick of using Diffie-Hellman in a three-way manner [3] doesn't work here, since the primitive used (FIPS 203, [4]) is only a key encapsulation method, and FIPS 204 only offers "regular" post-quantum signatures of the non-deniable kind.
Signal highly values deniability, and in this version they seem to have prioritized that over quantum-safe mutual authentication.
[1] https://support.signal.org/hc/en-us/articles/360007060632-Wh...
[2] https://signal.org/docs/specifications/pqxdh/#active-quantum...
[3] https://signal.org/docs/specifications/x3dh/
[4] https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.203.ipd.pdf
Very well written and digestible. I have much respect for the Signal people.
However, I'd like to mention that using usernames instead of phone numbers has been met with the classic "soon™" response for years now. When will they actually do it? This is the only thing I really, really dislike about Signal - their lack of communication on oddly specific things. Like them holding back the codebase for months on GitHub so as to not spoil the.. Surprise! MobileCoin!
About time! People picked this up soon after it was committed to the repository back in May, and the beta version of Signal has had dual keys aka "safety numbers" for a while now (maybe 1.5 months?). Happy to see they decided to release a blog post about it after all :)
Pick your platform: https://mobile.twitter.com/Th3Zer0/status/166107804719636481... https://ch.linkedin.com/posts/dr-angie-qarry-397538127_add-k... https://chaos.social/@luc/111048883207848400 (disclosure: the latter is myself; there was another Mastodon post I'm pretty sure, but when I search for PQXDH there it only shows my own post)
The blog doesn't mention it, but based on a code comment, it seems that ~two months from now the new key fingerprints will become mandatory for peers to remain trusted after you update your client
From the blog post:
> We want to extend our sincerest thanks and appreciation to all the people who contributed to the development of this protocol upgrade. This includes the cryptographic research community, the Kyber team, and the following people who directly contributed to our whitepaper
All that behind closed doors, apparently.
There was scarcely a mention of PQXDH to be found on the web besides the Signal source code and the handful of people who picked up on it on social media. A GitHub ticket about adding post-quantum support was responded to with "we have no announcements to make" and then closed due to inactivity. I suppose one only needs so many cooks, but why not make this whitepaper, the ideas going into the protocol design, the timeline, whatever goes into this security decision for an open source app visible, even if only read-only? It feels more like source-available than open-source-spirited, but I guess that's in line with "the ecosystem is moving" (Moxie's talk where he says that they can do better without a community developing more clients, integrations, federation, etc.).
I am a bit puzzled: governments and big corporations are pouring indecent amounts of money into developing quantum computers, whose main application, AFAICT, is to break cryptography.
...and this is defeated by changing our algorithms?
What's the use in developing quantum computers, then?