top | item 17849065

mcilai|7 years ago

Quite incredible that he was interested in NNs back in 1990. He closed this thread very well.

totoglazer|7 years ago

They were very much in vogue at the time. This was just as backprop was coming into its own, and before ANNs were largely surpassed by SVMs, boosting, ensembles, etc.

nabla9|7 years ago

This was just before the second AI winter. It involved neural networks, Prolog, Lisp, fuzzy logic, fears of Japan overtaking the US in AI, etc.

Lots of good work with neural networks was done back then:

   A learning algorithm for Boltzmann machines
   DH Ackley, GE Hinton, TJ Sejnowski - Cognitive Science, 1985

   Learning representations by back-propagating errors
   DE Rumelhart, GE Hinton, RJ Williams - Nature, 1986

   Phoneme recognition using time-delay neural networks
   A Waibel, T Hanazawa, G Hinton, K Shikano, KJ Lang - Readings
   in Speech Recognition, 1990

projectramo|7 years ago

As all the other responses point out, NNs were red hot back then.

The interest in NNs was ignited (in part) by a two-volume collection of essays called "Parallel Distributed Processing," edited by Rumelhart and McClelland.

Dean even cites them. And, if you read the contributors, it contains many (though not all) of the heavy hitters.

Reading it now, it will sound very familiar. All the amazing breakthroughs (object recognition, handwriting recognition, etc.) seemed to be there. But the rapid progress just seemed to stop: there was this quantum leap, and then you were back to grinding for even a 0.1% improvement.

For those who stuck it out through the second winter, things obviously paid off.

The intro essay is online:

https://stanford.edu/~jlmcc/papers/PDP/Chapter1.pdf

mark_l_watson|7 years ago

From my perspective, neural networks were a big thing in the late 1980s, when I was on a DARPA neural network tools panel for a year and wrote the initial version of the SAIC ANSim neural network project. We had some great results using simple backprop networks. Good times.

pimmen|7 years ago

They were very popular when they came out and until SVMs were introduced to the United States.

Then when the data explosion started during the 00s, it laid the groundwork for the NN comeback.

coldsauce|7 years ago

Weren't neural nets popular back then?

mcilai|7 years ago

That's a good point.

silverlake|7 years ago

I’m almost Dean’s age. My undergrad project was evolving NNs with genetic algorithms. AI was popular, but funding died abruptly soon after.

dekhn|7 years ago

The early 90s were an interesting time for NNs and other machine learning systems. I remember getting really interested, but being told that "NNs with more than 1 layer can't really be trained," so I went into simulation rather than training. It's really great that GPUs and deep backprop eventually restored the stature of NNs.
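For context on the "more than 1 layer can't really be trained" belief: even plain backprop, as described in the Rumelhart/Hinton/Williams era, can train a small two-layer network on XOR, a classic problem a single layer cannot solve. This is a minimal illustrative NumPy sketch (all hyperparameters and names are my own, not from the thread), not period-accurate code:

```python
import numpy as np

# Tiny two-layer network trained with plain backprop on XOR.
# Hyperparameters (4 hidden units, lr=1.0, 5000 steps) are illustrative.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
losses = []
for _ in range(5000):
    # Forward pass through both layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(((out - y) ** 2).mean())
    # Backward pass: gradients of squared-error loss via the chain rule.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates.
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

preds = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
```

The harder cases of the era were deeper nets and longer dependency chains, where vanishing gradients made training genuinely difficult until better initialization, activations, and GPU-scale compute arrived.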

plg|7 years ago

Not that incredible. Just about every CS / Psych / Cognitive Science Dept back then was into them. I did a project on NNs in my undergrad. Programmed in C. I’m sure thousands of others did as well.