
dmvaldman | 5 years ago

AGI in text is < 3yrs away.


Barrin92 | 5 years ago

There's zero understanding in any of this. It's still just superficial text parsing, essentially. Show me progress on Winograd schemas (e.g. "The trophy doesn't fit in the suitcase because it's too big" — which noun does "it" refer to?) and I'd be impressed. This hasn't got anything to do with AGI; it's an application of ML to very traditional NLP problems.

dmvaldman | 5 years ago

I think you are assuming that what is happening under the hood is that a human-input sentence is being parsed into a grammar. It is not.

chundicus | 5 years ago

I'm skeptical. Amazing progress has been made in the last 5-10 years, but it still feels like we need more paradigm shifts in the ML/AI field. It feels like we're approaching the upper limits of what stuffing mountains of data into a model can do.

But with the speed of the field, maybe we can figure it out in three years. It just seems like we're still missing some key components. Primarily, reasoning and learning causality.

azinman2 | 5 years ago

What breakthrough occurred?

dmvaldman | 5 years ago

Zero-shot and few-shot learning in GPT-3, and the lack of significant diminishing returns when scaling text models. Zero-shot learning is equivalent to saying "I'm just going to ask the model to do something it was not trained to do."
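To make the zero-shot/few-shot distinction concrete, here is a minimal sketch of how the two prompt styles differ. The translation pairs are taken from the GPT-3 paper's canonical example; the helper function and the model call are hypothetical (no real API is invoked), since the point is only that few-shot "learning" happens inside the prompt, with no weight updates:

```python
# Zero-shot: describe the task, give no worked examples.
zero_shot = "Translate English to French.\nEnglish: cheese\nFrench:"


def few_shot_prompt(examples, query):
    """Build a few-shot prompt: task description, then a handful of
    demonstration pairs, then the actual query. The model's weights
    are never updated; the 'learning' is conditioning on the prompt."""
    lines = ["Translate English to French."]
    for english, french in examples:
        lines.append(f"English: {english}\nFrench: {french}")
    lines.append(f"English: {query}\nFrench:")
    return "\n".join(lines)


demos = [("sea otter", "loutre de mer"), ("peppermint", "menthe poivrée")]
few_shot = few_shot_prompt(demos, "cheese")
print(few_shot)
```

Both strings would be sent to the model as-is; the only difference is the demonstrations embedded in the few-shot version.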