rococode | 2 years ago
It's always fun to mention that many of the foundational ideas in the modern wave of machine learning & neural networks stem from work done in the 50s and 60s: perceptrons, stochastic approximation, the precursors of backpropagation, etc. were all explored back then.
It was only after compute power scaled up enough to apply these techniques practically that they became revolutionary. Really makes you wonder what things people are working on right now that will also need to wait 30 years.