(no title)
matjet | 4 years ago
Two simulations of a chaotic system, started from identical initial conditions but run with different step sizes, will always diverge: the difference between their eventual positions does not stabilise as the steps are made smaller. For this reason, I am not even sure infinitesimal steps would avoid divergence from (ideal) reality. Plus y'know, the whole issue of a simulation with infinitesimal steps never making ANY progress, regardless of how fast it runs.
Therefore, I conclude that infinite precision is neither the problem with, nor the solution to, numerical simulation of chaotic behavior.
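The divergence described above is easy to see numerically. This is a minimal sketch (not from the thread): two forward-Euler integrations of the Lorenz system with the classic parameters, identical initial conditions, and step sizes differing by a factor of two. The choice of scheme, step sizes, and integration time are all illustrative assumptions.

```python
# Two Euler integrations of the Lorenz system (sigma=10, rho=28, beta=8/3),
# same start, different step sizes. After a chaotic timescale the two
# trajectories end up far apart on the attractor.

def lorenz(x, y, z, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Right-hand side of the Lorenz equations.
    return sigma * (y - x), x * (rho - z), x * y - beta * z

def euler(dt, steps, state=(1.0, 1.0, 1.0)):
    # Plain forward-Euler integration with a fixed step size.
    x, y, z = state
    for _ in range(steps):
        dx, dy, dz = lorenz(x, y, z)
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    return x, y, z

T = 20.0                              # long enough for chaos to act
a = euler(0.002, int(T / 0.002))      # coarser step
b = euler(0.001, int(T / 0.001))      # half the step size
dist = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
print(dist)  # large: halving the step does not reproduce the trajectory
```

Both runs stay on the (bounded) attractor, yet their final states differ by an amount comparable to the attractor's size, and refining the step further only changes *which* faraway point you land on.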
i_cannot_hack | 4 years ago
When you discretize a continuous equation for numerical analysis, you always make sure to use a consistent discretization. The point of consistency is that, together with stability, it guarantees the discrete solution converges to the exact solution of the continuous equation as the step size approaches 0 (for linear problems this is the Lax equivalence theorem).
Consistent discretizations are possible even for nonlinear equations.
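To illustrate convergence of a consistent scheme, here is a sketch using the simple test problem y' = -y, y(0) = 1 (my choice of example, not from the thread), whose exact solution is exp(-t). Forward Euler is consistent and first-order: halving the step size roughly halves the error at a fixed time.

```python
import math

def euler_error(dt, t_end=1.0):
    # Integrate y' = -y with forward Euler and return the error at t_end
    # against the exact solution exp(-t_end).
    steps = round(t_end / dt)
    y = 1.0
    for _ in range(steps):
        y += dt * (-y)          # one forward-Euler step
    return abs(y - math.exp(-t_end))

errors = [euler_error(dt) for dt in (0.1, 0.05, 0.025)]
ratios = [errors[i] / errors[i + 1] for i in range(2)]
print(errors)   # errors shrink as dt shrinks
print(ratios)   # each ratio close to 2: first-order convergence
```

The error ratios sitting near 2 are the signature of a consistent first-order method, which is exactly the convergence-as-dt-goes-to-0 property described above.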
Either way, we are simply discussing limitations of a chosen numerical method, which doesn't really support the arguments in the article.