
breadAndWater|7 years ago

None of the things you've mentioned are anything close to what I'm bringing up.

The point I'm making is that popular discussions of quantum effects are so wildly off-base that they've muddied the waters for anyone trying to understand, by casual reading, what happens between photons and electrons.

But when you see something like this emerge, it's really obvious that solutions to these problems were on the right track even as far back as the early 1900s, only to be derailed by the academics who emerged in the 1940s.

Principles such as the Huygens–Fresnel principle had it right very early: https://en.wikipedia.org/wiki/Huygens%E2%80%93Fresnel_principle
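For reference, the textbook statement of the principle (standard form, added here for context; it is not in the original comment) treats every point of a wavefront as a source of secondary spherical wavelets:

    % Huygens–Fresnel principle: the field at an observation point P is
    % the superposition of secondary spherical wavelets emitted from
    % every point Q of an earlier wavefront S.
    U(P) = \frac{1}{i\lambda} \iint_S U(Q)\, \frac{e^{ikr}}{r}\, K(\chi)\, dS

Here r is the distance from Q to P, k = 2\pi/\lambda, and K(\chi) is the inclination factor that suppresses the backward-travelling wavelet.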

So, was there an ulterior motive to all the complex obfuscation of math and inaccurate scientific reporting throughout the later 20th century? Or has it all been one big, innocent misunderstanding among aloof eggheads distracted by their gigantic, precious particle colliders?

One wonders.


XorNot|7 years ago

This is a massive conflation (and misunderstanding) of quantum mechanics. The processes discussed in this paper are statistical and large-scale, and thus essentially classical in nature.

The double-slit experiment, entanglement, etc. are all concerned with what happens with individual particles to produce those statistics, and what that means.
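A toy illustration of that distinction (my own sketch, not anything from the paper under discussion): sample single detections from the two-slit probability density, and the familiar fringe statistics only emerge in aggregate.

    # Toy double-slit: each particle lands at exactly one point, but the
    # landing points are drawn from |psi|^2, so fringes emerge statistically.
    import numpy as np

    wavelength = 500e-9        # 500 nm light
    slit_sep = 50e-6           # slit separation d
    screen_dist = 1.0          # slits-to-screen distance L
    x = np.linspace(-0.02, 0.02, 2000)   # screen positions (m)

    # Far-field two-slit intensity: I(x) ~ cos^2(pi * d * x / (lambda * L))
    density = np.cos(np.pi * slit_sep * x / (wavelength * screen_dist)) ** 2
    density /= density.sum()

    rng = np.random.default_rng(0)
    hits = rng.choice(x, size=100_000, p=density)   # one point per particle
    counts, _ = np.histogram(hits, bins=200)
    print(counts)   # alternating maxima and near-zero minima: the fringes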

For example, it's not remotely surprising that diffracted light can reconstruct an image of an object (the idea has been around for a while - e.g. evanescent-wave fluorescence, or the quest for a negative-refractive-index material that would beat the diffraction limit). But that's not the real reason it's unsurprising. It's unsurprising because you can also detect the existence of a solid object without touching it with so much as a photon, purely by letting the probability field of one potentially extend through it, and then observing whether diffraction patterns appear along the path the photons do travel (basically, a highly biased double-slit experiment reveals whether interference would have occurred well before a particle is ever likely to have hit the object obstructing one of the beam paths).
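For anyone curious, that interaction-free detection is the Elitzur–Vaidman scheme. A minimal amplitude calculation (my sketch, not from the paper) shows how blocking one interferometer arm changes the statistics at a port no photon otherwise reaches:

    # Mach–Zehnder interferometer (Elitzur–Vaidman): with both arms open,
    # interference keeps the photon out of the "dark" port entirely; block
    # one arm and the dark port fires 25% of the time, revealing the
    # obstacle even though those detected photons never touched it.
    import numpy as np

    BS = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)   # 50/50 beam splitter

    def port_probs(arm_blocked: bool) -> np.ndarray:
        state = BS @ np.array([1, 0], dtype=complex)  # photon enters port 0
        if arm_blocked:
            state[1] = 0.0    # amplitude in the blocked arm is absorbed
        state = BS @ state    # recombine at the second beam splitter
        return np.abs(state) ** 2  # detection probabilities (sum < 1 if absorbed)

    print(port_probs(False))  # [0. 1.]     -> dark port never fires
    print(port_probs(True))   # [0.25 0.25] -> a dark-port click implies an obstacle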

breadAndWater|7 years ago

The point being that computation is so cheap now that it's more difficult to promote confusion and obscure the facts.

It used to be expensive to compile massive data sets and reduce them to reliable statistical evidence, so it was easy to push concepts that had little supporting evidence. For example: "the particle passes through both slits", "the cat is alive and dead", "there are no hidden variables", "the source of an emission never ascribes state to its particles, and that state does not exist until inspected".

Now, such wild claims are in disagreement with rivers of data that are much more easily produced and reviewed computationally. Observations that were not previously possible now shed light on facts that were previously obscured. Without backing data, ideas prone to confusion could take root - particularly with voices of academic authority shouting down concepts that threatened the ivory tower.

But now, the technology to conduct measurements is cheaper, and data shouts louder. So something presented as fact in A Brief History of Time ("the particle passes through both slits") can no longer be supported by fame alone, simply because the author is revered. It's easier to produce and publish data (make high-fidelity video recordings of silicone-oil droplets demonstrating pilot-wave phenomena, and post them on YouTube).

In this example, pulling together raw data from sensor streams and dumping it into a high-performance computing pipeline reveals that diffraction itself is a state-producing phenomenon, and that reliable variables are produced by the diffractor but are later hidden by subsequent polarizers that drive downstream state. If the hidden variables weren't reliable, there would be no possibility of composing an image from the statistical analysis of the diffraction. The diffraction would produce no reliable signal to reconstruct, because that signal would not exist, since local hidden variables are forbidden behind no-go boundaries.
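For a feel for how an image can be recovered from diffraction intensities at all, here is a toy 1D phase-retrieval sketch in the error-reduction style. It is purely illustrative, assumes the object's support is known, and is emphatically not the paper's pipeline nor evidence for any hidden-variable reading:

    # Toy 1D phase retrieval (error-reduction, Gerchberg–Saxton style):
    # recover an object from only the magnitude of its far-field
    # diffraction pattern, plus a known-support and positivity constraint.
    # 1D recovery has ambiguities (shifts, flips); this only illustrates
    # that the intensity statistics carry recoverable structure.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 256
    obj = np.zeros(n)
    obj[100:130] = rng.random(30)           # unknown nonnegative object
    support = obj > 0                       # assume the support is known

    measured_mag = np.abs(np.fft.fft(obj))  # the "diffraction statistics"

    g = rng.random(n) * support             # random start inside the support
    for _ in range(500):
        G = np.fft.fft(g)
        G = measured_mag * np.exp(1j * np.angle(G))  # impose measured magnitude
        g = np.fft.ifft(G).real
        g = np.where(support & (g > 0), g, 0.0)      # impose support + positivity

    err = np.linalg.norm(np.abs(np.fft.fft(g)) - measured_mag)
    print(err / np.linalg.norm(measured_mag))  # mismatch shrinks over iterations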

pjc50|7 years ago

> So, was there an ulterior motive to all the complex obfuscation of math, and inaccurate scientific reporting throughout the later 20th century?

No, of course not; why would you think that? It's a difficult subject, and simplifying it for the lay reader is a lossy process.

moh_maya|7 years ago

Ah; thank you for the reply; I understand what you were trying to say. I'll leave my original comment in place for context for other readers. :)