I'm really excited by all of this free university-level material flooding the web, as I never even started college due to financial concerns (i.e., I didn't want to take out any loans).
Do you know whether Prof. Ng has updated the material since the first run of the class?
We are still in the honeymoon phase of free, online university courses, so I think there's been relatively little criticism of what's available now, but I'll go for it: I was disappointed by the Coursera/Stanford ML class. It was obviously watered down, the homeworks were very (very) easy, and I retained little or nothing of significance.
In contrast, the Caltech class was clearly not watered down, and, as the material was much more focused (with a strong theme of generalization, an idea almost entirely absent from the Stanford class, as I recall), I feel I learned far more.
Another big difference: the Caltech class had traditional hour-long lectures, a simple web form for submitting answers to the multiple-choice* homeworks, and a plain vBulletin forum. The lectures were live on ustream, but otherwise, no fancy infrastructure.
So I think that some interesting questions will come up. Do we need complex (new) platforms to deliver good classes? For me, the answer right now is no -- what clearly matters is the quality and thoughtfulness of the material and how well it is delivered. Can a topic like machine learning be taught effectively to someone who doesn't have a lot of time, or who doesn't have the appropriate background (in CS, math)? Can/should it be faked? I don't think so, but I think there are certainly nuances here.
* Despite being multiple-choice, the homeworks were not easy -- they typically required a lot of thought, and many required writing a lot of code from scratch.
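For concreteness, here is a minimal sketch (Python/NumPy, my own toy example rather than anything taken from the actual assignments) of the kind of from-scratch coding that sort of homework tends to involve: the perceptron learning algorithm, which the course introduces in its opening lectures. The random linear target function and the data generation below are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: points in [-1, 1]^2 labeled by a random linear target function.
N = 100
X = np.hstack([np.ones((N, 1)), rng.uniform(-1, 1, size=(N, 2))])  # x0 = 1 bias term
w_true = rng.standard_normal(3)
y = np.sign(X @ w_true)

# PLA: while any point is misclassified, pick one and nudge w toward it.
w = np.zeros(3)
for step in range(10_000):
    wrong = np.flatnonzero(np.sign(X @ w) != y)
    if wrong.size == 0:
        break                      # data is linearly separable, so PLA converges
    i = rng.choice(wrong)
    w += y[i] * X[i]               # update rule: w <- w + y_i * x_i

print(f"converged after {step} updates")
```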
Don't miss out on the original (i.e., pre-Coursera) Andrew Ng lectures, starting here: http://tinyurl.com/6uqeoo2
These are also mathematically more rigorous.
I took this course last term after reading an introductory book on machine learning and skimming through Andrew Ng's CS 229 lecture notes. I thought this class was particularly excellent at emphasizing the theoretical aspects of machine learning, as well as emphasizing some underlying themes (like avoiding overfitting with regularization and cross validation). The class didn't cover as many models and algorithms as many of the other ML classes, but I've found those relatively easy to learn with the intuition this course gave me.
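As a small illustration of those two themes (my own toy example, not material from the class): fit an over-flexible polynomial model, add an L2 (weight-decay) penalty, and use k-fold cross-validation to decide how strong the penalty should be. The dataset, the polynomial degree, and the candidate lambda values below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy data from a cubic target, fit with a degree-8 polynomial (prone to overfitting).
N = 30
x = rng.uniform(-1, 1, N)
y = x**3 - 0.5 * x + 0.2 * rng.standard_normal(N)
Phi = np.vander(x, 9, increasing=True)          # polynomial features up to degree 8

def ridge_fit(Phi, y, lam):
    """Closed-form ridge regression: w = (Phi^T Phi + lam*I)^-1 Phi^T y."""
    d = Phi.shape[1]
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(d), Phi.T @ y)

def cv_error(Phi, y, lam, k=5):
    """Average squared validation error over k folds."""
    folds = np.array_split(np.arange(len(y)), k)
    errs = []
    for val in folds:
        train = np.setdiff1d(np.arange(len(y)), val)
        w = ridge_fit(Phi[train], y[train], lam)
        errs.append(np.mean((Phi[val] @ w - y[val]) ** 2))
    return np.mean(errs)

# Pick the regularization strength with the lowest cross-validated error.
lambdas = [0.0, 1e-3, 1e-2, 1e-1, 1.0]
best = min(lambdas, key=lambda lam: cv_error(Phi, y, lam))
print("chosen lambda:", best)
```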
I really liked the Google talk http://www.youtube.com/watch?v=AyzOUbkUf3M, and there are a bunch of advances in machine learning that mix techniques, like inductive learning and genetic programming. The Google video also shows some combinations of techniques that make learning much faster.
Fortunately I can find videos and whitepapers on all those subjects, but it seems the libraries are all very much stuck in the past. Maybe I just don't know about them, but is there a library/toolbox like Weka that implements both modern and older algorithms and lets you play with datasets, mixing and matching them? (There's a sketch of the kind of workflow I mean just below this comment.) Maybe I just couldn't find it, but Weka seems too primitive for that.
Disclaimer: I majored in AI a long time ago and I understand most of these concepts, but I haven't touched the field since I finished, so I'm not up to date on everything; sorry if I missed a famous tool or something.
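One possible answer to the question above (my suggestion, not the commenter's) is scikit-learn, whose uniform fit/predict interface across classic and newer algorithms supports exactly this kind of mixing and matching. A rough sketch, with an arbitrary synthetic dataset and an arbitrary set of models:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

# Synthetic dataset purely for illustration.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Every estimator exposes the same fit/predict interface, so swapping
# algorithms on the same dataset is a one-line change.
models = {
    "k-NN": KNeighborsClassifier(),
    "SVM (RBF)": SVC(),
    "random forest": RandomForestClassifier(random_state=0),
    "logistic regression": LogisticRegression(max_iter=1000),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name:20s} accuracy = {scores.mean():.3f}")
```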
If you enjoyed Geoff Hinton's talk you will probably find the Theano 'deep learning' library to be of use. It's still undergoing quite a lot of iteration, but it's powerful, and you get to run your stuff on the GPU for added fun: http://deeplearning.net/software/theano. Incidentally, Hinton gave another Google Tech Talk in 2010: http://www.youtube.com/watch?v=VdIURAu1-aU.
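A minimal Theano sketch, assuming Theano is installed (the logistic-regression example below follows the style of the standard tutorial, not anything from Hinton's talk, and running on the GPU additionally requires the appropriate Theano configuration). It shows the core idea: build symbolic expressions, let Theano derive gradients, and compile the result into a callable function.

```python
import numpy as np
import theano
import theano.tensor as T

x = T.dmatrix("x")                         # symbolic matrix of inputs
y = T.dvector("y")                         # symbolic vector of 0/1 targets
w = theano.shared(np.zeros(3), name="w")   # model parameters (shared variable)

p = 1 / (1 + T.exp(-T.dot(x, w)))          # logistic model
cost = -T.mean(y * T.log(p) + (1 - y) * T.log(1 - p))
grad = T.grad(cost, w)                     # symbolic gradient of the cost

# Compile a training step that also updates w by gradient descent.
train = theano.function([x, y], cost, updates=[(w, w - 0.1 * grad)])

data_x = np.random.randn(100, 3)
data_y = (data_x[:, 0] > 0).astype(float)
for _ in range(100):
    c = train(data_x, data_y)
print("final cost:", c)
```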
Being a newbie in ML, I found the intro video quite helpful. I was having difficulty grasping the idea of training, why it is needed, etc., and I found Abu-Mostafa's explanations quite helpful. I have taken Ng's ML class as well, and due to the heavy use of stats I could not grasp it.
Now I am learning stats from Prof. Thrun at Udacity; I assume I will then be able to grasp it much better.
P.S.: Is anyone else trying to learn the ML basics? Why not learn together and solve some interesting problems together? Contact me via the email given in my profile.
I found some courses (I don't know if there are more):
Andrew Ng Stanford CS229: http://cs229.stanford.edu/info.html
Caltech(the one from the OP link): http://work.caltech.edu/telecourse.html
Tom Mitchell Carnegie Mellon: http://www.cs.cmu.edu/~tom/10701_sp11/
I'm considering following the Tom Mitchell course, as it seems to go deeper into the details and also because it uses a pretty cool bibliography.
What do you think, am I making the right choice?