top | item 18966708


luckyt | 7 years ago

It's really interesting that XGBoost failed when run on a dataset that had no noise. I've seen a similar thing happen with the Adam optimizer when training neural networks on perfect synthetic data [1]. It's always worth digging in to understand why this happens -- it gives you a glimpse into the internals of your algorithms.

[1]: https://datascience.stackexchange.com/questions/25024/strang...

svantana | 7 years ago

I wouldn't call that comparable -- Adam gets 99.99% of the way, and this person is wondering why it doesn't go all the way. The answer of course is that it wasn't designed to. In this case XGBoost fails to do anything at all, which seems like a major bug.
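This behavior is easy to see on a toy problem. Below is a minimal pure-Python sketch (my own illustration, not from the linked question) of Adam minimizing f(x) = x^2 from a textbook implementation of the update rule: with a fixed learning rate, the effective step size stays on the order of lr near the optimum, so the iterate gets very close to the minimum but does not land exactly on it.

```python
def adam_minimize(grad, x0, lr=0.1, beta1=0.9, beta2=0.999,
                  eps=1e-8, steps=2000):
    """Run Adam on a 1-D objective, given its gradient function."""
    x, m, v = x0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g      # first-moment (mean) estimate
        v = beta2 * v + (1 - beta2) * g * g  # second-moment estimate
        m_hat = m / (1 - beta1 ** t)         # bias correction
        v_hat = v / (1 - beta2 ** t)
        x -= lr * m_hat / (v_hat ** 0.5 + eps)
    return x

# Minimize f(x) = x^2 (gradient 2x) starting from x = 5.0.
x_final = adam_minimize(lambda x: 2 * x, x0=5.0)
# x_final ends up very close to the minimizer 0, but not exactly on it:
# near the optimum, m_hat / sqrt(v_hat) stays O(1), so the update size
# is roughly lr and the iterate hovers around the minimum.
```

That hovering is by design: Adam's per-parameter step normalization trades exact convergence for robust progress, which is why the Stack Exchange question's "last 0.01%" gap is expected behavior rather than a bug, in contrast to XGBoost producing no useful model at all.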