top | item 18436654

ramgorur | 7 years ago

>Just because you can prove convergence for convex functions does not mean you can't prove it for non-convex functions.

I don't understand. How do you prove that gradient descent is guaranteed to escape local minima?

heyitsguay | 7 years ago

Convex functions aren't the only functions with a single local (and global) minimum - consider sqrt(|x|) for a simple 1d example.
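A quick numerical sketch of this point (Python assumed; the second demo function, log(1 + x^2), is my own choice and not from the thread): sqrt(|x|) fails the midpoint convexity test, yet non-convexity alone doesn't stop gradient descent from finding a unique global minimum.

```python
import math

def f(x):
    # heyitsguay's example: non-convex, yet a single global minimum at x = 0
    return math.sqrt(abs(x))

# Midpoint test: a convex function must satisfy
# f((a + b) / 2) <= (f(a) + f(b)) / 2 for all a, b.
a, b = 0.0, 1.0
lhs = f((a + b) / 2)        # sqrt(0.5) ~ 0.707
rhs = (f(a) + f(b)) / 2     # 0.5
print(lhs > rhs)            # True -> sqrt(|x|) is not convex

# Plain gradient descent still reaches the unique minimum of another
# non-convex, single-minimum function, log(1 + x^2) (chosen for the demo
# because its gradient is bounded, unlike sqrt(|x|)'s near 0).
g = lambda x: 2 * x / (1 + x * x)   # derivative of log(1 + x^2)
x = 3.0
for _ in range(50):
    x -= 0.5 * g(x)                 # fixed step size 0.5
print(abs(x) < 1e-3)                # True: converged to the global minimum
```

log(1 + x^2) is concave for |x| > 1, so no convexity argument applies, but every descent direction still points toward the single minimum at 0.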

ramgorur | 7 years ago

Yes, that's true. But in the optimization domain, "convexity" is often understood in terms of sets, not just the second derivative of a function, because you might have search spaces where the objective function isn't differentiable at all. In those cases "convex" means a convex set.
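To make the set-based notion concrete, here is a minimal sketch (Python; the sample sets and the `midpoint_in` helper are illustrative, not any standard API): a set is convex iff every line segment between two of its members stays inside it, and probing the midpoint gives a cheap necessary condition.

```python
# A set here is just a membership predicate R -> bool.
def midpoint_in(S, a, b):
    # Necessary condition for convexity: if a and b are in S,
    # their midpoint must be too. (Not sufficient in general.)
    return S((a + b) / 2)

interval = lambda x: -1 <= x <= 1   # [-1, 1] is convex
union = lambda x: abs(x) >= 1       # (-inf, -1] U [1, inf) is not

print(midpoint_in(interval, -0.5, 1.0))  # True: 0.25 stays in [-1, 1]
print(midpoint_in(union, -1.0, 1.0))     # False: midpoint 0 falls in the gap
```

The second set fails because the segment between -1 and 1 leaves the set entirely, which is exactly the kind of structure a set-based convexity definition rules out, with no derivatives involved.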