top | item 36576647


pooloo | 2 years ago

You really just answered this yourself:

> Assume a computer would need several minutes to simulate everything that would've happened in the stick. I clearly got the output faster than a computer (and with more precision), so does this imply I'm doing anything particularly fascinating?

You assert that you got output faster than a computer, and with more precision. However, you have no measured data, only a casual observation; as stated by zdragnar:

> The difference between you breaking a stick and the computer modeling it is that you've measured nothing. You don't know, with any precision, the amount of force you used, the rate the stick broke at, how much mass remains in the two pieces and how much was lost to splintering, etc.

Then you further state that you can measure those with a ruler and a scale; however, that inherently takes time and introduces significant uncertainty into your measurements and calculations, whereas a computer provides all of those numbers directly.

The other thing to consider is the method of simulation, such as finite-element analysis (FEA), and the resolution you need. You can get segmented data all the way down to a specific volume of that stick; good luck with the hand calculations on that.


fnordpiglet | 2 years ago

I don’t think you guys are tackling this right. You can observe the stick breaking to absurd precision, even using an electron tunneling microscope, and produce immense amounts of data about the stick’s state at every point in the breaking and afterwards, faster than a computer can simulate breaking the stick. The key is that the stick breaking, and its observation at any level of detail, happens in parallel: all aspects of the system act coherently in real time, simultaneously, with effects and information transmitted between all quantum parts of the system at the speed of light. The computer, on the other hand, is a binary-encoded Turing machine forced to rely on imperfect, relatively serial (compared to the parallelism of reality) algorithmic mathematical models that produce encodings approximating what reality produces naturally. The issue is the layers of abstraction, not the measurement and detail of the system. All the measurable information about the stick and its breaking is available and measurable with no impact on the time it takes to break the stick.

Here’s perhaps a clearer example. The universe influences itself at a distance, with all bodies acting on all other bodies at once. This is a classical n-body problem where n → ∞. We can quite easily measure these effects without perturbing the system, whereas a computer simulating it would not be able to progress a single iteration in the entire span of the universe.
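The cost the comment alludes to is easy to see in code: a direct n-body step evaluates every pairwise interaction, so the work grows as n². This is a sketch under stated assumptions (unit masses, G = 1, a softening term `eps` to avoid singularities, plain Euler integration), not a production integrator.

```python
import numpy as np

def accelerations(pos, eps=1e-3):
    """Pairwise gravitational accelerations for unit-mass bodies.

    pos has shape (n, 3); the pairwise separation tensor below has
    shape (n, n, 3), which is the O(n^2) cost made explicit.
    """
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]   # diff[i, j] = pos[j] - pos[i]
    dist2 = np.sum(diff**2, axis=-1) + eps**2              # softened squared distances
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                          # no self-interaction
    return np.sum(diff * inv_d3[:, :, np.newaxis], axis=1)  # attractive, shape (n, 3)

# A few explicit Euler steps for 100 randomly placed bodies.
rng = np.random.default_rng(0)
pos = rng.normal(size=(100, 3))
vel = np.zeros_like(pos)
dt = 1e-3
for _ in range(10):
    vel += dt * accelerations(pos)
    pos += dt * vel
```

Doubling n quadruples the per-step work; tree codes and fast multipole methods reduce this, but any simulation still serializes what the universe does for free at every point at once.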

ithkuil | 2 years ago

But the comparison is not fair.

Yes, the electron microscope can image a lot of detail "in parallel", but not all details from all angles, or all internal microfractures. You can't easily measure the temperature gradient in every cubic nanometer of the material, etc.

The simulation is slow because it works at that level, and thus as a side effect it will also give you all of that output.

Obviously, if you don't need all that information, you may find another route to arrive at the results you want.