
Training a Human Takes 20 Years of Food

18 points | Aldipower | 7 days ago | news18.com

21 comments


mitthrowaway2|7 days ago

Yes, it does. It's kind of a fixed cost though, since we're going to feed and educate our youth anyway, unless Sam Altman would have those people starve to death.

asacrowflies|6 days ago

I think you're almost onto something with how these people think...

lisp2240|7 days ago

Has there ever been a time when a wealthy person would run their mouth like this without any fear of an angry mob tearing their limbs off their body? Maybe right before the French Revolution?

Gibbon1|7 days ago

I don't think my parents and grandparents spent their lives working towards a future where grifters like Altman could take everything for themselves.

b3ing|7 days ago

In the Epstein files they talk about how to rid the world of poor people

robbbed|6 days ago

I get the feeling that this guy has never been punched in the mouth. Otherwise he might be more careful with what he says.

random_duck|6 days ago

You can remove "in the mouth".

7777777phil|7 days ago

This comparison only works if you assume scaling keeps paying off. Sara Hooker's research (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5877662) shows compact models now outperform massive predecessors, and scaling laws only predict pre-training loss, not downstream performance. If marginal returns on compute are falling (https://philippdubach.com/posts/the-most-expensive-assumptio...), "energy per query" hides the real problem: a trillion dollars of infrastructure built on the bet that they won't.
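For reference, the scaling laws in question have the Chinchilla form L(N, D) = E + A/N^alpha + B/D^beta. A minimal sketch (coefficients are roughly the published Chinchilla fits, used here only for illustration) makes the point concrete: the formula predicts loss, and says nothing about downstream performance.

    # Chinchilla-form scaling law: L(N, D) = E + A/N^alpha + B/D^beta.
    # Coefficients are roughly the published Chinchilla fits; illustrative only.
    def pretraining_loss(n_params, n_tokens,
                         E=1.69, A=406.4, B=410.7, alpha=0.34, beta=0.28):
        """Predicted pre-training loss for N parameters and D training tokens."""
        return E + A / n_params**alpha + B / n_tokens**beta

    # More compute keeps lowering the *predicted loss*...
    for n, d in [(70e9, 1.4e12), (140e9, 2.8e12), (280e9, 5.6e12)]:
        print(f"N={n:.0e}, D={d:.0e} -> predicted loss {pretraining_loss(n, d):.3f}")
    # ...but nothing here predicts downstream task performance, which is
    # exactly the gap the cited research points at.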

p0w3n3d|7 days ago

You are the carbon they want to reduce

TrackerFF|7 days ago

I asked ChatGPT to do some napkin math, and it seems like on average it would take a human about 13.75 million kcal worth of food in those 20 years.

eulgro|7 days ago

That's about 16 MWh, but considering each food calorie actually requires 5-10 calories of input energy (oil mostly), let's say 80-160 MWh.

I couldn't find much on training AI models. Apparently GPT-3 used 1.3 GWh for training. So maybe ~10 GWh for newer models?

So... training humans is the bargain here, I guess.
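For anyone checking the arithmetic, a napkin sketch (1 kcal = 1.163 Wh exactly; the 5-10x food-system multiplier and the 1.3 GWh GPT-3 figure are the thread's own numbers, not verified here):

    # Napkin math for the thread's figures.
    # 1 kcal = 4184 J = 1.163 Wh; other inputs are from the comments above.
    KCAL_TO_WH = 4184 / 3600          # ~1.163

    human_kcal = 13.75e6              # 20 years of food, per the napkin math above
    food_mwh = human_kcal * KCAL_TO_WH / 1e6
    with_upstream = (5 * food_mwh, 10 * food_mwh)   # 5-10x input energy

    print(f"food on the plate: {food_mwh:.0f} MWh")                                # ~16 MWh
    print(f"incl. food system: {with_upstream[0]:.0f}-{with_upstream[1]:.0f} MWh") # ~80-160 MWh
    print(f"GPT-3 training:    {1.3e3:.0f} MWh (1.3 GWh, per the comment)")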

zipping1549|7 days ago

If I were one of his family, I'd advise him to hire someone to stop him from saying things like this.

brnt|6 days ago

Training an LLM takes petabytes of theft.

bravetraveler|7 days ago

Now compare our waste and what might be extracted, psycho.

kderbyma|7 days ago

This is what I expect from a mid marketing team..... not a supposed visionary thought leader (/s).....

This is completely fallacious thinking that I assume is meant as a means of manipulating people who don't think deeply about the implications, and the procession of ideas that leads to such obviously disingenuous intellectual dishonesty....

Waste heat.... is not the same as a biologically closed loop in which microbes, bacteria, mycelium, plants, and animals all work in a concerted effort....

My food becomes fertilizer.... His waste becomes nothing of utility (unless they have amazing efficiencies that defy what we know about physics...)

878654Tom|7 days ago

It also just doesn't make sense. Like, we train a human and that takes 20 years of food.

To train an LLM you needed a collection of ~800 GiB of data (The Pile). To generate that pile, you needed millions to billions of humans. So did training the LLM suddenly take tens of billions of years of food, or are we not allowed to make the same shitty comparison?
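To put a number on that reductio, a toy sketch (the contributor count is purely hypothetical; the point is only that the corpus embodies vastly more than one person's 20 years):

    # Toy version of the reductio: if the corpus embodies text from many
    # humans, each "trained" on 20 years of food, charge the model for all of it.
    contributors = 1e9     # hypothetical count of people whose text is in the data
    years_each = 20        # the article's own figure

    print(f"{contributors * years_each:.0e} human-years of food")  # -> 2e+10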

la64710|7 days ago

Oh so sad … 20 years the corporate overlords have to wait for their minions to be ready…

/s