newrotik | 3 years ago
One of Newton's method's major selling points is that once it's close to a local minimum, under the right smoothness assumptions it converges faster and faster to an exact optimizer - in practice the number of correct significant digits roughly doubles each iteration once you are close enough. This is Newton's method's region of quadratic convergence; see [0], Theorem 14.1, p. 3.
[0] https://www.stat.cmu.edu/~ryantibs/convexopt-F16/scribes/new...
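A minimal sketch of that digit-doubling behavior (my own toy example on f(x) = x² − 2, not taken from the linked notes):

```python
import math

# Newton's method for a root of f(x) = x^2 - 2; the root is sqrt(2).
# In the quadratic-convergence region the error roughly squares each
# step, e_{k+1} ~ C * e_k^2, so correct digits roughly double.
f = lambda x: x * x - 2.0
df = lambda x: 2.0 * x

x = 1.5  # starting point already near the root
errors = []
for _ in range(4):
    x = x - f(x) / df(x)           # Newton update
    errors.append(abs(x - math.sqrt(2)))

# errors shrink like ~2e-3, ~2e-6, ~2e-12, then machine precision
```

Each error is roughly the square of the previous one, until floating-point precision bottoms out.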