top | item 40032408

liliumregale | 1 year ago

Regularization as a concept is taught in introductory ML classes. A simple example is L2 regularization: you add to your loss function the sum of squares of the parameters (times some constant k). The parameters then have to trade off between fitting the training data and satisfying this constraint, which (hopefully!) reduces overfitting.

The specific regularization techniques that any one model is trained with may not be publicly revealed, but OAI hardly deserves credit for the concept.
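The idea above can be sketched in a few lines; this is an illustrative toy (the function name, the parameter values, and the choice of k are all made up, not any particular library's API):

```python
def l2_regularized_loss(params, data_loss, k=0.01):
    # Total loss = data loss from the model
    # plus k times the sum of squared parameters (the L2 penalty).
    penalty = k * sum(p * p for p in params)
    return data_loss + penalty

# Toy example: three parameters and a hypothetical data loss of 0.8.
params = [0.5, -1.2, 3.0]
total = l2_regularized_loss(params, data_loss=0.8, k=0.01)
```

Gradient descent on this total loss pushes each parameter toward zero in proportion to its size, which is why large weights only survive if they pay for themselves in data-loss reduction.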
