so_tired | 5 years ago

> Human reading/talking/listening equivalent of 200 pages of text per day for 80 years would be just 13GB of raw data or 3B tokens

I am sympathetic to the "total lifetime input" argument.
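
For scale, here's a quick back-of-envelope check of the quoted figures; the characters-per-page and characters-per-token values are my assumptions, not from the quote:

```python
# Sanity check of the quoted lifetime-input figures.
# Assumed: ~2,000 chars/page, 1 byte/char, ~4 chars/token (rough BPE average).

PAGES_PER_DAY = 200
YEARS = 80
CHARS_PER_PAGE = 2_000   # assumption
CHARS_PER_TOKEN = 4      # assumption

pages = PAGES_PER_DAY * 365 * YEARS          # 5,840,000 pages
raw_bytes = pages * CHARS_PER_PAGE           # at 1 byte/char
tokens = raw_bytes // CHARS_PER_TOKEN

print(f"{raw_bytes / 1e9:.1f} GB raw text")  # ~11.7 GB, order of the quoted 13GB
print(f"{tokens / 1e9:.1f} B tokens")        # ~2.9 B, close to the quoted 3B tokens
```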

But humans get a hand-crafted curriculum of inputs, evolved over thousands of iterations, in a near-optimal language encoding.

Also, if unsupervised learning handles the in-distribution part, and deep RL seems not bad at searching out-of-distribution... then are we getting close?

Surely a 100x scaling of current techniques could yield a machine useful enough to make many human tasks tractable?

(I am not getting into the Turing / AGI / Skynet argument.)
