lwneal | 2 years ago
Probably also canonical are Goodfellow's Deep Learning [2], Koller & Friedman's PGMs [3], the Krizhevsky ImageNet paper [4], the original GAN [5], and arguably also the AlphaGo paper [6] and the Atari DQN paper [7].
[1] https://aima.cs.berkeley.edu/
[2] https://www.deeplearningbook.org/
[3] https://www.amazon.com/Probabilistic-Graphical-Models-Princi...
[4] https://proceedings.neurips.cc/paper_files/paper/2012/file/c...
[5] https://arxiv.org/abs/1406.2661
bodecker | 2 years ago
[1] https://probml.github.io/pml-book/book1.html
[2] https://probml.github.io/pml-book/book2.html
javajosh | 2 years ago
Note that I don't think it's a great idea to just "read through" an 800-page textbook, even if you can. You've got to do the exercises and check your own knowledge, or else you'll be spinning your wheels.