plafl | 3 years ago

I like to explain that AI > ML > NN.

Now, NNs are the ones getting results in computer vision, natural language, and more. I think most people would say that other ML approaches are computational statistics. The goalposts for AI keep moving.

If you are truly interested in the math of AI, I think PAC-Bayes learning is more appropriate, and your book is Understanding Machine Learning [1] (not an easy read). A gentler intro would be Learning From Data [2]. If anyone can recommend a book/paper, that would be awesome; I'm always on the lookout.

[1] https://www.cs.huji.ac.il/w~shais/UnderstandingMachineLearni...

[2] https://work.caltech.edu/telecourse

kvathupo|3 years ago

Although it tends towards Deep Learning as opposed to AI, I highly recommend Bishop's Pattern Recognition and Machine Learning. It not only provides a solid Bayesian perspective, but also comments on the subtleties of applying theory. For the latter, its discussion of overparameterization in (I think?) the first chapter comes to mind.

[1] - https://www.microsoft.com/en-us/research/people/cmbishop/prm...

akomtu|3 years ago

NN > ML. The proof is the nematode or the fly, I don't remember which: its simple NN has been fully mapped, and how it works still remains a mystery. ML, which is just matrix multiplication at its core, is a laughably simplistic model of an NN.

soVeryTired|3 years ago

GP means artificial neural network, not an actual nervous system.

Machine learning contains ANNs as a sub-discipline. Other non-ANN topics in ML include tree ensembles, Gaussian processes, and sampling theory.

kvathupo|3 years ago

I think one can't compare deep learning to classical machine learning, since they serve different purposes. While there have been great strides in the interpretability of NNs, the analytic models of classical machine learning are favored from a computational and interpretability perspective. Conversely, if the relationships in our data are constantly evolving (e.g., the stochastic process followed by a time series changing from one interval to the next), then NNs are more appropriate.

jjtheblunt|3 years ago

linear-algebraic matrix multiplication composed with non-linear threshold evaluators (activation functions), no?
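That "matrix multiply plus non-linearity" view can be sketched in a few lines of plain Python; the weights and inputs below are illustrative toy values, not from any real model:

```python
def matvec(W, x):
    """Multiply matrix W (a list of rows) by vector x."""
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) for row in W]

def relu(v):
    """Non-linear threshold: max(0, v), elementwise."""
    return [max(0.0, v_i) for v_i in v]

# A toy 2-in, 2-out layer with fixed weights.
W = [[1.0, -1.0],
     [0.5,  2.0]]
x = [3.0, 1.0]

h = relu(matvec(W, x))  # linear algebra, then the non-linearity
# h == [2.0, 3.5]
```

Without the non-linearity, stacking such layers would collapse into a single matrix multiply, which is why the "discontinuity" part matters.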

MrBusch|3 years ago

Seconding this. "Learning From Data" is one of the best intros to machine learning theory I've seen through the years!