oergiR | 10 years ago
Probabilistic models have never really gone away. This presentation by LeCun actually suggests embedding neural networks inside of various types of probabilistic models: factor graphs and conditional random fields. This is, for example, how speech recognition works: the output of a neural network is fed into a probabilistic model (a hidden Markov model).
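The hybrid NN/HMM setup mentioned above works by turning the network's per-frame state posteriors into scaled likelihoods that the HMM can decode. A minimal sketch of that conversion (shapes, priors, and the uniform-prior assumption here are illustrative, not from any particular system):

```python
import numpy as np

# The NN emits per-frame state posteriors p(state | frame). In the hybrid
# NN/HMM approach these are converted via Bayes' rule to scaled likelihoods
# p(frame | state) ∝ p(state | frame) / p(state), which the HMM decodes.
rng = np.random.default_rng(0)
n_frames, n_states = 4, 3  # toy sizes, purely illustrative

posteriors = rng.dirichlet(np.ones(n_states), size=n_frames)  # stand-in for NN softmax outputs
priors = np.full(n_states, 1.0 / n_states)  # state priors; real systems estimate these from training data

scaled_likelihoods = posteriors / priors  # Bayes' rule, up to the constant p(frame)

# Viterbi decoding over the HMM would then use these (in the log domain)
# as emission scores alongside the HMM transition probabilities.
log_emissions = np.log(scaled_likelihoods)
print(log_emissions.shape)  # (4, 3)
```

The division by the prior matters: without it, frequent states would be favored twice, once by the network and once by the HMM's language/transition model.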
jhartmann | 10 years ago
However, combining learned features with other systems is a very powerful approach, and training SVMs on top of the learned features of a neural network is, I would say, common. I am personally more interested in approaches like Deep Fried Convnets (http://arxiv.org/abs/1412.7149), which build kernel methods into the neural networks themselves.
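Deep Fried Convnets build on the Fastfood transform, which is itself a fast version of random Fourier features. As a hedged sketch of the underlying idea (plain random features here, not the structured Hadamard-based Fastfood variant; all parameter values are illustrative): an RBF kernel can be approximated by the inner product of explicit random feature maps.

```python
import numpy as np

# Random Fourier features (Rahimi & Recht): approximate the RBF kernel
# k(x, y) = exp(-||x - y||^2 / (2 * sigma^2)) by z(x) @ z(y), where z is
# an explicit, randomized feature map. Fastfood speeds up the projection
# W @ x using structured matrices; this sketch uses a dense Gaussian W.
rng = np.random.default_rng(42)
d, D, sigma = 8, 4096, 1.0  # input dim, number of random features, kernel width

W = rng.normal(0.0, 1.0 / sigma, size=(D, d))  # Gaussian projection directions
b = rng.uniform(0.0, 2 * np.pi, size=D)        # random phases

def z(x):
    """Random Fourier feature map: sqrt(2/D) * cos(Wx + b)."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x, y = rng.normal(size=d), rng.normal(size=d)
exact = np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))
approx = z(x) @ z(y)
print(exact, approx)  # agree up to O(1/sqrt(D)) error
```

Because z(x) is an explicit, differentiable map, it can sit inside a network and be trained end-to-end, which is roughly the trick Deep Fried Convnets exploit.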
agibsonccc | 10 years ago
Talking to some of the users of recursive nets, I hear they will be renaming them to tree RNNs, which should help clear up the confusion a bit.
oergiR | 10 years ago
I believe the best comparison between CTC (which is billed as recurrent neural networks without the HMM) and the traditional approach is by people at Google: Sak et al., "Learning acoustic frame labeling for speech recognition with recurrent neural networks", ICASSP 2015. (I can't find a PDF online.)
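For readers unfamiliar with CTC: the part that replaces the HMM's alignment machinery is a simple collapsing rule applied to the network's per-frame output, merge repeated labels, then delete the blank symbol. A minimal sketch (the blank character and example string are my own illustration):

```python
# CTC's collapsing rule: a recurrent net emits one symbol per frame,
# including a special blank; the output sequence is obtained by merging
# consecutive repeats and then removing blanks. Repeats separated by a
# blank survive, which is how CTC represents genuine double letters.
BLANK = "-"  # stand-in for the CTC blank symbol

def ctc_collapse(frames):
    out = []
    prev = None
    for sym in frames:
        if sym != prev and sym != BLANK:
            out.append(sym)
        prev = sym
    return "".join(out)

print(ctc_collapse("--hh-e-ll-lo--"))  # "hello"
print(ctc_collapse("aa--aa"))          # "aa"
```

Training then sums over all frame sequences that collapse to the target transcript, which is what removes the need for a pre-computed HMM alignment.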
speechduh | 10 years ago