top | item 13556378

elyase | 9 years ago

It can be argued that some algorithms, like Random Forests, don't impose a generalization penalty as you increase the number of parameters (trees).
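A quick sketch of that claim using scikit-learn's RandomForestClassifier on synthetic data (dataset and sizes are arbitrary choices for illustration): held-out accuracy should hold roughly steady, not degrade, as the forest grows.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification problem (illustrative only).
X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5,
                                          random_state=0)

# Grow forests of increasing size and record test accuracy.
scores = {}
for n in (10, 100, 500):
    clf = RandomForestClassifier(n_estimators=n, random_state=0)
    scores[n] = clf.fit(X_tr, y_tr).score(X_te, y_te)

print(scores)  # more trees should not hurt held-out accuracy
```

Averaging more bootstrapped trees reduces variance without raising bias, which is why the test score plateaus rather than turning over the way an overfit model's would.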

apathy | 9 years ago

RF is appallingly difficult to re-use for inference, though. At least with a DNN or CNN you can pop open the hood and see what the model is doing at various points.
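"Popping the hood" on a network is straightforward in a way a forest of hundreds of decision paths isn't. A minimal NumPy sketch (random, untrained weights, purely illustrative) of reading off a small network's intermediate activations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network with random weights (a real model would
# have trained weights; this only shows the inspection mechanics).
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)

x = rng.normal(size=(1, 4))          # one input example
h = np.maximum(0, x @ W1 + b1)       # hidden activations: directly inspectable
logits = h @ W2 + b2                 # output layer

print("active hidden units:", np.flatnonzero(h[0]))
print("logits:", logits[0])
```

Each intermediate tensor is a named value you can print, plot, or probe; the nearest RF analogues (per-tree decision paths, impurity-based feature importances) are far more diffuse.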

Tradeoffs, tradeoffs everywhere. It's almost like traditional mathematical statistics has something to offer them fancy machine learners. (Breiman was a professor of statistics, after all... ahead of his time, but no less a statistician.)