shoyer|4 years ago
As a former quantum physicist, I find it a little troubling to read "quantum theory has reached a dead end" in specific reference to the interpretation of quantum mechanics. Most quantum physicists could not care less about how quantum mechanics is interpreted as long as it makes highly accurate quantitative predictions, and there are still plenty of interesting open problems for quantum theory (e.g., related to the practical design of algorithms and hardware for quantum computers).
This article also misses what is likely the leading interpretation of quantum mechanics among actual quantum physicists, namely that the measurement problem is solved by decoherence (the quantitative theory of how classical states emerge from quantum states):
https://en.wikipedia.org/wiki/Measurement_problem#The_role_o...
https://royalsocietypublishing.org/doi/10.1098/rsta.2011.049...
adriand|4 years ago
Hawking espoused this idea he called “model dependent realism”. The idea is that every human understanding of reality is model-dependent; that is, it is not “reality” that we truly understand (we can’t), but rather in every case we have some model of reality that is useful in particular situations. For instance, we know that Newtonian physics is not “real”, but it is perfectly accurate in certain situations. So it is not “wrong” when used in those situations; in fact, it is right.
The author of the article writes, “While Einstein won a Nobel Prize for proving that light is composed of particles that we call photons, Schrödinger’s equation characterizes light and indeed everything else as wave-like radiation. Can light and matter be both particle and wave? Or neither? We don’t know.”
In model dependent realism, we can ignore this apparent contradiction. In some situations the model of light as a particle is the most useful, and in others, the model where it is a wave is the most useful. We have to accept that it is not “really” either of these models, but that no matter what we do, any model we come up with for it will still just be a model.
gus_massa|4 years ago
But we know! The answer is neither.
Light and matter are weird things that are impossible to describe in everyday language, but they can be described very precisely in the language of math. The problem is that the equations are too complicated and difficult to use.
The equations have been tested thoroughly, for example in particle accelerators, but only in experiments with very few things moving around. It's very difficult to use them when the experiment gets bigger.
In some cases, you can make some approximations and get almost the same result if, instead of the full correct equations, you use the wave equation. It's just an approximation. Light and matter are never waves, but in some cases they can be approximated as waves.
In other cases, you can make some approximations and get almost the same result if, instead of the full correct equations, you use the particle equation. It's just an approximation. Light and matter are never particles, but in some cases they can be approximated as particles.
And in other weird cases, both approximations give very inaccurate predictions.
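To make this concrete, here is a minimal sketch in Python with NumPy (an editor-added toy illustration, not the full correct equations): a single underlying quantum amplitude yields both pictures. The "wave" readout is the interference intensity |psi1 + psi2|^2, and the "particle" readout is a set of discrete detection events sampled from that same intensity.

    import numpy as np

    # One underlying amplitude for a two-slit toy model.
    # "Wave" picture: the detection probability is |psi1 + psi2|^2.
    # "Particle" picture: discrete hits sampled from that same density.
    x = np.linspace(-10, 10, 1001)
    psi1 = np.exp(1j * 2.0 * x)   # toy amplitude via slit 1
    psi2 = np.exp(-1j * 2.0 * x)  # toy amplitude via slit 2
    prob = np.abs(psi1 + psi2) ** 2
    prob /= prob.sum()

    # Sample 5000 photon-like detection events; their histogram
    # reproduces the interference fringes of the "wave" picture.
    rng = np.random.default_rng(0)
    hits = rng.choice(x, size=5000, p=prob)
    counts, edges = np.histogram(hits, bins=40, range=(-10, 10))
    print(counts)

Neither readout is the amplitude itself; each is an approximation that happens to be convenient for a particular kind of question.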
colordrops|4 years ago
"The Copenhagen Interpretation is sometimes called "model agnosticism" and holds that any grid we use to organize our experience of the world is a model of the world and should not be confused with the world itself. Alfred Korzybski, the semanticist, tried to popularize this outside physics with the slogan, "The map is not the territory." Alan Watts, a talented exegete of Oriental philosophy, restated it more vividly as "The menu is not the meal."
dorgo|4 years ago
Yes, we can only describe with models what can be observed. But it is a bad excuse for ignoring contradictions in (or between) models.
DebtDeflation|4 years ago
So has GR. Yet the two theories seem to be utterly incompatible.
japanuspus|4 years ago
To be a bit more specific about how _decoherence_ solves this: one way to see it is that classicality (i.e., observables having specific values) is an emergent property in the limit of near-infinite degrees of freedom, in the same way that, e.g., thermodynamic properties (temperature, etc.) are emergent properties of classical systems in that same limit.
Putting it on the edge, claiming that quantum theory is at a dead end is like claiming statistical physics is at a dead end.
One of my personal favorites for how to formalize this is the work on "pointer states" by Wojciech H. Zurek. There is a freely available Physics Today article [0], and you can find surveys of further work, e.g., in the introduction of [1].
[0]: https://arxiv.org/abs/quant-ph/0306072 (Zurek, "Decoherence and the transition from quantum to classical -- REVISITED")
[1]: https://arxiv.org/abs/1508.04101 (Brasil, "Understanding the Pointer States")
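As a toy illustration of that limit, here is an editor-added sketch in Python with NumPy (loosely inspired by, but not taken from, Zurek's formalism): couple one system qubit to N environment qubits and trace the environment out. The off-diagonal coherence of the reduced density matrix is a product of N overlaps, so it vanishes exponentially as the number of environmental degrees of freedom grows.

    import numpy as np

    # Toy decoherence: a qubit in (|0> + |1>)/sqrt(2) couples to N
    # environment qubits. Conditional on the system being |1>, environment
    # qubit k is rotated from |0> to cos(t_k)|0> + sin(t_k)|1>. Tracing
    # out the environment leaves the reduced density matrix
    #     rho = [[1/2, c/2], [conj(c)/2, 1/2]],  c = prod_k cos(t_k),
    # so the coherence |c| decays exponentially with N: definite-valued,
    # classical-looking behavior emerges in the many-degrees-of-freedom
    # limit.
    def coherence(n_env, rng):
        thetas = rng.uniform(0.0, np.pi, size=n_env)
        return abs(np.prod(np.cos(thetas)))

    rng = np.random.default_rng(0)
    for n in (1, 5, 20, 100):
        print(f"N = {n:3d} -> |coherence| = {coherence(n, rng):.2e}")

Pointer states, in this picture, are the system states (here |0> and |1>) that the interaction with the environment leaves untouched, which is why they are the ones that survive to look classical.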
konschubert|4 years ago
But could we then please stop teaching the collapse nonsense to first year students?
The logical inconsistencies of the collapse interpretation are an insult to their intellect.
luc4sdreyer|4 years ago
Can you explain or express this in a simpler way? Is it almost like saying macroscopic?
pdonis|4 years ago
I think "solved" is too strong. The Wallace paper you reference, for example, does not claim that decoherence solves the measurement problem. His claim is only the more modest one that understanding decoherence helps to clarify what the measurement problem actually is.
l33tman|4 years ago
In essence, the Copenhagen interpretation is still OK as a simplification in most cases. This is reflected in the fact that practising solid-state physicists have successfully used this 1920s style of QM for almost 100 years now.
michaelwilson|4 years ago
Maybe you meant to say "formerly paid to be a quantum physicist"? :-)
SinParadise|4 years ago
By disentangling himself from quantum physics. He was a quantum physicist, so I assume he knows how to do it.