
accounting2026 | 7 days ago

I didn't read/hear it as reducing human life to 'training energy', but I don't like the comparison at the technical level.

Firstly, the math isn't even close. A human being consumes maybe 15 MWh of food energy from ages 0 to 20, while modern frontier models take on the order of 100,000 MWh to train, roughly a 10,000x difference. Furthermore, the human is actively doing 'inference' (living, acting, producing) during those 20 years of training, and is also doing lots of non-brain stuff.

Beyond the energy math, it's an apples-to-oranges comparison. A human brain doesn't start out as a blank slate: it carries billions of years of evolutionary priors for language and spatial reasoning that LLMs have to learn from scratch, which could explain why a human can do some things more cheaply. Also, the learning material available to a human is inherently created to be easily ingested by a human brain, whereas a blank LLM first has to build the capacity to process that data.

Altman seems to hint at a comparison to the whole of human evolution, but that seems unfair in the other direction, because humans and human evolution had to make discoveries from scratch through trial and error, whereas LLMs get to ingest the final "good stuff". Either way you slice it, it's just not a good comparison, though not an 'inhuman' or immoral one.
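A quick back-of-envelope sketch of the ratio described above; both inputs are the comment's own rough estimates, not measured values:

```python
# Rough check of the energy figures in the comment above.
# Both numbers are the comment's own estimates, not measurements.
human_food_mwh = 15            # food energy consumed by a human, ages 0-20
model_training_mwh = 100_000   # order-of-magnitude frontier-model training cost

ratio = model_training_mwh / human_food_mwh
print(f"training / human-food ratio: ~{ratio:,.0f}x")
```

The exact ratio comes out closer to ~7,000x, but with inputs this coarse it is the same order of magnitude as the "10,000x" in the comment.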


WithinReason|6 days ago

A US resident consumes 76 MWh per year [0], so 1.52 GWh over 20 years. A single model can be trained once and used by millions. Therefore LLMs are ~10000x more energy efficient than humans.
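The amortization argument can be sketched numerically. All figures here are assumptions taken from the thread (76 MWh/year per capita, 100,000 MWh training cost, "millions" of users taken as 1M), and inference cost is ignored:

```python
# Back-of-envelope amortization of training cost over users.
# All inputs are assumptions from the thread, not measurements;
# per-query inference energy is deliberately left out.
per_capita_mwh_per_year = 76        # total US energy use per person [0]
years = 20
human_mwh = per_capita_mwh_per_year * years   # energy over 20 years

training_mwh = 100_000              # frontier-model training cost (parent comment)
users = 1_000_000                   # "millions" of users, taken as 1M
per_user_mwh = training_mwh / users # training cost amortized per user

print(f"human: {human_mwh} MWh, per user: {per_user_mwh} MWh, "
      f"ratio: ~{human_mwh / per_user_mwh:,.0f}x")
```

Under these assumptions the amortized training share per user is ~100 kWh against ~1.5 GWh per human, which is where a ratio on the order of 10,000x comes from.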

https://ourworldindata.org/energy-production-consumption#per...

accounting2026|5 days ago

Your numbers include energy used for transport, heating, etc. Sam's numbers were about what the human body itself uses for training, which is why I used caloric consumption.