
Talinx | 2 years ago

Does this hold up when taking quantum mechanics into account?

Let's assume you need at least m = n^2 particles for a physical system modelling an n-by-n matrix, and model the change of the system, from setting the state of the particles (to the matrix elements) to measurement, by a finite number of interactions between particles (each exchanging a photon):

- a particle can interact with a particle of the heat bath

- a particle can interact with another particle of the m particles of the system

I guess this result holds up if the second kind of interaction does not matter, because then the first kind alone takes a constant time for each particle. The whole thing becomes a massively parallel computation (with m threads).

But the second kind of interaction should matter; otherwise, how could the system capture/model dependencies between variables (I guess)?

My intuition would be that subsystems of particles get closer to equilibrium by interacting with the heat bath, and that two subsystems then combine their wave functions into one by the second kind of interaction. You get subsystems in local thermal equilibrium that combine and split their wave functions, and as time goes to t_0 the sizes of the subsystems in local equilibrium grow larger and larger until they reach size m at time t_0. This does seem to take longer for more particles (not so massively parallel anymore). Does anyone have insight into how this scales?

(This only matters under the assumption that the number of photon exchanges each particle experiences is finite and constant (or grows with m) for a fixed temperature. I could easily have missed something that makes these thoughts irrelevant.)

aifer4 | 2 years ago

These results probably would not hold in the same form for a quantum system. By a quantum system, I mean one where the decoherence time is on the order of the other timescales present in the system (e.g. the correlation time). In fact, it would be much more difficult to engineer such a system, and we would not want one for this purpose: the results rely on convergence to a classical canonical equilibrium distribution, which has to be generalized in the quantum case, meaning it may not have the properties we want.

We would also have to deal with measurement backaction on the system in the quantum limit, which we definitely don't want. In the classical limit, where the energy is much larger than Planck's constant divided by the timescale of the system, this is not an issue.

One more thing: our algorithms use continuous measurement of the system. For a quantum system, the quantum Zeno effect would effectively "freeze" the system, so we would definitely not sample the full distribution.

rsp1984 | 2 years ago

> But the second interaction should matter, otherwise how can the system capture/model dependencies between variables (I guess)?

Keep in mind that the venerable (and enormously successful!) gradient descent method does not model dependencies between variables either, and it manages to find solutions too. It just has to iterate a bit, actually not unlike the method presented here.
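To illustrate the point (my own toy sketch, not code from the article): plain gradient descent on the quadratic f(x) = 1/2 x^T A x - b^T x, whose minimizer solves Ax = b. No single step models the dependencies between variables; the coupling enters only through the gradient, and iteration does the rest.

```python
import numpy as np

# Sketch: gradient descent minimizing f(x) = 1/2 x^T A x - b^T x.
# The minimizer of f solves A x = b. Each update is a purely local
# gradient step; the coupling between variables enters only through
# the gradient A x - b.

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4 * np.eye(4)                # symmetric positive definite
b = rng.standard_normal(4)

x = np.zeros(4)
step = 1.0 / np.linalg.eigvalsh(A).max()   # safe step size (below 2/L)
for _ in range(5000):
    x -= step * (A @ x - b)                # local gradient step

# x now approximates np.linalg.solve(A, b)
```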

tnecniv | 2 years ago

The coupling between variables is given by the (quadratic) potential of the system.

I think there is some confusion because they have two tiers of "particles" going on. The first is the masses coupled in the spring-mass system; in that system, each component of x is a particle. However, any specific vector x is, in the parlance of thermodynamics, a single microstate of the system. You can then form a macrostate, i.e. a distribution over microstates, by considering an infinite (or nearly infinite) number of such microstate particles. The dynamics of the macrostate are given by the Fokker-Planck equation, where both interactions you mention come from the diffusion term, which is only present due to the connection with a heat bath.

So the n coupled masses are viewed as a single particle in an abstract system with (stochastic) gradient dynamics.
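To make that abstraction concrete, here is a sketch (my own toy simulation, not the paper's code) of overdamped Langevin dynamics, i.e. stochastic gradient dynamics, on the quadratic potential f(x) = 1/2 x^T A x - b^T x. The canonical stationary distribution is Gaussian with mean A^{-1} b and covariance A^{-1}, so averaging x over an ensemble of noisy trajectories estimates the solution of Ax = b.

```python
import numpy as np

# Sketch (toy model): Euler-Maruyama simulation of
#     dx = -(A x - b) dt + sqrt(2) dW,
# stochastic gradient dynamics on f(x) = 1/2 x^T A x - b^T x.
# The canonical stationary distribution is N(A^{-1} b, A^{-1}), so the
# ensemble average of x estimates the linear-system solution.

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3))
A = M @ M.T + 3 * np.eye(3)      # symmetric positive definite
b = rng.standard_normal(3)

K, steps, dt = 4000, 2000, 1e-2  # chains, time steps, step size
X = np.zeros((K, 3))             # K independent trajectories
for _ in range(steps):
    drift = -(X @ A - b)         # gradient of f at each chain (A symmetric)
    X += drift * dt + np.sqrt(2 * dt) * rng.standard_normal((K, 3))

x_bar = X.mean(axis=0)
# x_bar approximates np.linalg.solve(A, b)
```

The total simulated time (steps * dt = 20) is far longer than the slowest relaxation time of the potential, so the ensemble is well equilibrated before averaging.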