JuettnerDistrib | 3 years ago
1. Procrastination seems to be a type of early stopping. I knew I had a good strategy in school!
2. Something that seems to be sorely missing in machine learning (I'm not an ML expert) is error bars. Take the figure at the end: as you increase the number of parameters in the model, your error bars grow (at least in the overfitting regime), and they become infinite once you have more parameters than data points. Indeed, chi^2 tests are routinely used in physics/astro to check for exactly this. Of course, you need error bars on the data points to do it. So perhaps the real difficulty is in assigning meaningful uncertainties to your pictures/test scores/politicians.
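A toy numpy sketch of that point (the data, noise level, and degrees are made up for illustration, not taken from the article): fit polynomials of growing degree to 10 noisy points and watch the parameter error bars from the covariance sigma^2 (A^T A)^{-1} blow up as the parameter count approaches the number of data points.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (my own assumptions): 10 noisy samples of a quadratic with a
# *known* noise level sigma, so chi^2 and error bars are meaningful.
n, sigma = 10, 0.5
x = np.linspace(0.0, 1.0, n)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(0.0, sigma, n)

bars = []
for deg in (2, 5, 9):
    A = np.vander(x, deg + 1)              # design matrix: deg+1 parameters
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    chi2 = np.sum(((y - A @ coef) / sigma) ** 2)
    dof = n - (deg + 1)                    # residual degrees of freedom
    # Parameter covariance sigma^2 (A^T A)^{-1}; its diagonal gives error bars.
    cov = sigma**2 * np.linalg.inv(A.T @ A)
    worst = np.sqrt(np.max(np.diag(cov)))  # largest single-parameter error bar
    bars.append(worst)
    print(f"deg={deg}: dof={dof}, max error bar = {worst:.3g}")
```

At deg = 9 the 10 parameters exactly match the 10 points: dof hits zero and the error bars are already enormous. One step further, with 11 parameters, A^T A is singular, the covariance doesn't exist, and the error bars are formally infinite, which is the regime described above.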
visarga | 3 years ago
In large neural nets the effect is reversed: the larger the model, the better it generalises, even from the same training data.
smartmic | 3 years ago
Do you have any references for this claim? To me, it seems counterintuitive.