neural nets at the same time require multiple passes through the data (epochs).
if we can train a model in one epoch instead of 10000 epochs, that's a breakthrough!
True, but it sounds like you’re just shifting computation from training to inference, and I’m not sure that’s a very good trade-off to make: you’re likely to predict on much more data than you trained on (e.g. ranking models at Google, FB, etc.)
Not sure I get your point; both DNNs and SVMs require one forward pass for inference, so there is no difference there.
If an SVM model can converge in one epoch, how is it not more efficient than the status quo with DNNs?
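(Not from the thread, just an illustrative sketch of the one-epoch-vs-many-epochs comparison being debated. It uses scikit-learn's `SGDClassifier` with hinge loss, which is a stochastic-gradient linear SVM; the dataset and epoch counts are arbitrary assumptions.)

```python
# Sketch: train a linear SVM (hinge loss via SGD) for one epoch vs. many
# epochs and compare held-out accuracy. Settings are illustrative only.
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification data (arbitrary choice for the demo).
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One pass over the data (max_iter=1 means one epoch of SGD).
one_pass = SGDClassifier(loss="hinge", max_iter=1, tol=None, random_state=0)
one_pass.fit(X_tr, y_tr)

# Many passes over the data.
many_pass = SGDClassifier(loss="hinge", max_iter=100, tol=None, random_state=0)
many_pass.fit(X_tr, y_tr)

print("1-epoch test accuracy:  ", one_pass.score(X_te, y_te))
print("100-epoch test accuracy:", many_pass.score(X_te, y_te))
```

On an easy linearly separable problem like this, one epoch often gets close to the multi-epoch accuracy; either way, inference cost per example is identical for both models, which is the point made above.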
sdenton4|5 years ago
eugenhotaj|5 years ago
somurzakov|5 years ago