top | item 47135079


emregucerr | 5 days ago

People dismiss this as a meme too quickly, but I think it's a good thought experiment, not only for comparing energy consumption but also learning efficiency. AI is often criticized for its low learning efficiency, but compared to a human it's not looking too bad. Say a human becomes an AGI-level learner by the time they are 14 years old. Human vision is approx. 500 megapixels, which works out to approx. 1.7 GB of visual data per second. Over 14 years that means it takes approx. 800 PETABYTES of data to 'pre-train' a human into a good-enough generalist learner. Compare Llama 4 from Meta, whose training data set consisted of 30 trillion tokens — roughly 120 TB, a mere 0.12 petabytes.
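The napkin math above can be spelled out in a few lines. The 1.7 GB/s visual bandwidth and the ~4 bytes per text token are the commenter's rough assumptions, reproduced here as stated, not measured figures:

```python
# Napkin math: visual data a human "sees" by age 14 vs. Llama 4's text corpus.
SECONDS_PER_YEAR = 365.25 * 24 * 3600          # ~31.6 million seconds

# Assumption from the comment: ~1.7 GB of visual data per second.
human_bytes = 14 * SECONDS_PER_YEAR * 1.7e9
human_pb = human_bytes / 1e15                  # ~751 PB, i.e. "approx. 800 PB"

# Llama 4: 30 trillion tokens; assume ~4 bytes of text per token.
llama_bytes = 30e12 * 4
llama_pb = llama_bytes / 1e15                  # 0.12 PB (120 TB)

ratio = human_pb / llama_pb                    # humans "ingest" thousands of times more raw bytes
print(f"human: {human_pb:.0f} PB, llama 4: {llama_pb:.2f} PB, ratio: {ratio:.0f}x")
```

By this (very rough) accounting, the human gets several thousand times more raw input than the model, which is the point the comment is making about learning efficiency.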

I am well aware this is flimsy napkin math at best, but I find that comparing LLMs to humans in a more serious tone is a fun and useful exercise.
