blueplanet200 | 2 years ago
This is distinct from classical computers, where you can describe a bit without needing to describe the other bits in the computer. You cannot describe a qubit (in a way that's computationally useful, at least) without describing all of them.
krastanov | 2 years ago
To describe a set of classical bits completely, you need a probability distribution over the whole system (including possible classical correlations induced by the crosstalk that OP spoke about). That probability distribution is a stochastic vector that has 2^n real positive components if we have n bits.
To describe a set of qubits completely, you need at least a "quantum probability distribution", i.e. a ket that has 2^n complex components (I am skipping the discussion of density matrices).
Both the classical and the quantum description of the bits require exponentially many real numbers. This exponential behavior on its own is not enough to explain the "quantum computational advantage". The exponential behavior is well known in plenty of classical contexts (e.g. under the name "curse of dimensionality") and is classically tackled with Monte Carlo modeling.
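The counting argument above can be sketched in a few lines. This is a minimal illustration (the uniform distribution and uniform-superposition ket are arbitrary choices for the example), showing that a complete classical description is 2^n nonnegative reals summing to 1, while a complete quantum description is 2^n complex amplitudes whose squared magnitudes sum to 1:

```python
import itertools
import math

n = 3  # number of (qu)bits; illustrative only

# Classical: a full description is a probability distribution over
# all 2^n bit strings -- a stochastic vector of 2^n nonnegative reals.
classical_state = {bits: 1.0 / 2**n
                   for bits in itertools.product([0, 1], repeat=n)}
assert abs(sum(classical_state.values()) - 1.0) < 1e-12  # sums to 1

# Quantum: a full description is a ket of 2^n complex amplitudes;
# the squared magnitudes sum to 1.
ket = [1 / math.sqrt(2**n) + 0j for _ in range(2**n)]
assert abs(sum(abs(a)**2 for a in ket) - 1.0) < 1e-12

print(len(classical_state), len(ket))  # both have 2^n = 8 entries
```

Either way, the object you need to track grows as 2^n; the difference lies in what operations the two kinds of vectors support, not in their size.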
Scott Aaronson's lecture notes cover that quite well.
In the end, the issue of crosstalk is modeled in a very similar way in the quantum and the classical case, and if it forbade the existence of quantum computers, it should forbid the existence of classical ones as well. In both cases it is the use of linear binary (or, in the quantum case, stabilizer) codes that solves the issue.
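To make the classical side of that remark concrete, here is a sketch of the simplest linear binary code, the 3-bit repetition code (stabilizer codes are the quantum generalization of this idea; the function names are just illustrative):

```python
def encode(bit):
    # Simplest linear binary code: repeat the logical bit three times.
    return [bit, bit, bit]

def correct(word):
    # Majority vote recovers the logical bit as long as at most one
    # physical bit was flipped by noise or crosstalk.
    return 1 if sum(word) >= 2 else 0

codeword = encode(1)
codeword[0] ^= 1           # crosstalk flips one physical bit
print(correct(codeword))   # -> 1, the logical bit survives
```

The same trade applies in both regimes: you spend extra physical bits (or qubits) to make the logical bit robust against the correlated noise the hardware cannot eliminate.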
MobiusHorizons | 2 years ago
If I have understood your argument correctly, I don’t think the conclusion follows from the premise, because crosstalk is not fundamental to classical computing. By that I mean that for a given logical CPU design, crosstalk can be (and is) minimized through physical design characteristics (e.g. grounding, wire spacing) without reducing the size of the computation. The same cannot, afaik, be said of quantum computers, where the interaction between qubits is essential to the computation.