soist | 1 year ago

There is no such thing as too much optimization. Early stopping exists to prevent overfitting to the training set. It's a trick, like most advances in deep learning, because the underlying mathematics is fundamentally not suited to creating intelligent agents.

rocqua | 1 year ago

Is overfitting different from 'too much optimization'? Optimization still needs a value to optimize. Overfitting is the result of too much optimization of not quite the right value (i.e. minimizing training error when what you actually want to reduce is prediction error).
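
A minimal sketch of that distinction (hypothetical data and names, not anyone's production code): gradient descent only ever sees the training loss, while early stopping watches held-out validation loss to decide when further optimization of the training objective starts hurting the value you actually care about.

    import numpy as np

    # Toy linear regression with noisy labels. The optimizer minimizes
    # training MSE; early stopping halts when validation MSE stops
    # improving, even though training MSE would keep decreasing.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))
    w_true = rng.normal(size=50)
    y = X @ w_true + rng.normal(scale=2.0, size=200)  # noisy targets

    X_train, y_train = X[:100], y[:100]
    X_val, y_val = X[100:], y[100:]

    w = np.zeros(50)
    lr = 1e-3
    best_val, best_w, patience, bad_epochs = float("inf"), w.copy(), 20, 0

    for epoch in range(10_000):
        # Gradient of training MSE -- the value actually being optimized.
        grad = X_train.T @ (X_train @ w - y_train) / len(y_train)
        w -= lr * grad
        val_loss = np.mean((X_val @ w - y_val) ** 2)  # proxy for prediction error
        if val_loss < best_val:
            best_val, best_w, bad_epochs = val_loss, w.copy(), 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:  # validation stopped improving: stop
                break

    print(f"stopped at epoch {epoch}, best validation MSE {best_val:.3f}")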

soist | 1 year ago

What value is being optimized and how do you know it is too much or not enough?