top | item 9714957


oergiR | 10 years ago

Probabilistic models. Recent research often focuses on Bayesian models.

Probabilistic models have never really gone away. This presentation by LeCun actually suggests embedding neural networks inside various types of probabilistic models, such as factor graphs and conditional random fields. This is, for example, how speech recognition works: the output of a neural network is fed into a probabilistic model (a hidden Markov model).
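A minimal sketch of that hybrid NN/HMM interface, assuming the common setup where the network's softmax gives per-frame state posteriors and the HMM decoder wants (scaled) likelihoods; all numbers here are made up for illustration:

```python
import numpy as np

# Toy per-frame softmax outputs p(state | frame) from a hypothetical net,
# plus state priors p(state) estimated from training alignments.
posteriors = np.array([[0.7, 0.2, 0.1],
                       [0.1, 0.8, 0.1]])
priors = np.array([0.5, 0.3, 0.2])

# Bayes' rule, up to a constant independent of the state:
#   p(frame | state)  is proportional to  p(state | frame) / p(state)
# These scaled log-likelihoods are what the HMM's decoder consumes.
scaled_loglik = np.log(posteriors) - np.log(priors)
print(scaled_loglik.shape)  # (2, 3): frames x states
```

The division by the prior is what turns the discriminative network output into something the generative HMM machinery can score.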


jhartmann | 10 years ago

Actually, state-of-the-art speech recognition has switched over to having a Recursive Neural Network run directly over the audio input. Take a look at the paper at http://arxiv.org/abs/1412.5567 and the write-up at http://usa.baidu.com/deep-speech-lessons-from-deep-learning/

However, combining learned features with other systems is a very powerful approach, and stacking SVMs on top of the learned features of a neural network is, I would say, common. I personally am more interested in approaches like Deep Fried Convnets (http://arxiv.org/abs/1412.7149) that combine kernel methods as part of the neural networks themselves.
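The kernel-inside-the-net idea builds on random Fourier features (Rahimi & Recht); Deep Fried Convnets replace the dense random matrix with a structured Fastfood transform to make the layer cheap. A plain (non-Fastfood) sketch of the underlying trick, with arbitrary toy sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
d, D = 8, 256  # input dim and number of random features (toy sizes)

# Random Fourier features: a random map z such that z(x) . z(y)
# approximates the RBF kernel exp(-||x - y||^2 / 2). Deep Fried
# Convnets swap the dense W below for a structured Fastfood transform
# so this layer can sit inside a net and be trained end to end.
W = rng.normal(size=(D, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def z(x):
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x = 0.1 * rng.normal(size=d)
y = 0.1 * rng.normal(size=d)
approx = z(x) @ z(y)                          # kernel estimate
exact = np.exp(-np.sum((x - y) ** 2) / 2.0)   # true RBF kernel value
```

With more random features D, the inner product of the feature maps concentrates around the true kernel value.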

agibsonccc | 10 years ago

Not to nitpick, but I just want people to realize that there are actually recursive nets, which rely on a parser to build the structure (these are the nets that use backpropagation through structure). Then there are recurrent nets (LSTMs, multimodal) that rely on backpropagation through time.

From talking to some of the users of recursive nets, they will be renaming them to tree RNNs, which should help clear up the confusion a bit.
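The distinction above can be made concrete with a single toy composition cell (hypothetical weights, not any real system): a recurrent net folds the cell over time, while a recursive/tree net folds the same kind of cell over a parser-supplied tree.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4
W = 0.1 * rng.normal(size=(d, 2 * d))  # shared toy composition weights

def compose(a, b):
    """Combine two d-dim vectors into one (same cell for both nets)."""
    return np.tanh(W @ np.concatenate([a, b]))

def recurrent(xs):
    """Recurrent net: fold left to right over a sequence.
    Gradients would flow by backpropagation through time."""
    h = np.zeros(d)
    for x in xs:
        h = compose(h, x)
    return h

def recursive(tree):
    """Recursive net ("tree RNN"): fold over a binary parse tree.
    Gradients would flow by backpropagation through structure."""
    if isinstance(tree, tuple):
        left, right = tree
        return compose(recursive(left), recursive(right))
    return tree  # leaf: an input vector

xs = [rng.normal(size=d) for _ in range(3)]
h_time = recurrent(xs)                       # chain structure
h_tree = recursive(((xs[0], xs[1]), xs[2]))  # tree structure from a parser
```

A sequence is just the special case of a tree that leans entirely to one side, which is why the two families are so easily confused.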

oergiR | 10 years ago

I know that Andrew Ng and colleagues say that they don't use HMMs. I haven't spoken with them (I haven't seen them at speech conferences), so I do not know whether they actually believe this themselves.

I believe the best comparison between "CTC" (which is billed as recurrent neural networks without the HMMs) and the traditional approach is by people at Google: Sak et al., "Learning acoustic frame labeling for speech recognition with recurrent neural networks", ICASSP 2015. (I can't find a PDF online.)
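For readers unfamiliar with CTC: the recurrent net emits one label (or a special blank) per frame, and the transcript is read off by collapsing repeated labels and dropping blanks; that collapsing rule is the part of the alignment machinery that replaces the HMM. A toy sketch of the decoding convention (greedy, per-frame):

```python
BLANK = "-"

def ctc_collapse(frames):
    """Collapse a per-frame CTC label string: merge runs of the same
    label, then drop blanks. Blanks separate genuine repeats."""
    out = []
    prev = None
    for f in frames:
        if f != prev and f != BLANK:
            out.append(f)
        prev = f
    return "".join(out)

print(ctc_collapse("hh-e-ll-ll-oo"))  # -> "hello"
```

Note how the blank between the two "ll" runs is what lets the output contain a genuine double letter.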

speechduh | 10 years ago

State of the art is still very much using WFSTs and DNN-HMMs. IBM and Google are still beating Baidu.