Yes it does. It's kind of a fixed cost though, since we're going to feed and educate our youth anyway, unless Sam Altman would have those people starve to death.
Has there ever been a time when a wealthy person could run their mouth like this without any fear of an angry mob tearing them limb from limb? Maybe right before the French Revolution?
This comparison only works if you assume scaling keeps paying off. Sara Hooker's research (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5877662) shows that compact models now outperform their massive predecessors, and that scaling laws only predict pre-training loss, not downstream performance. If marginal returns on compute are falling (https://philippdubach.com/posts/the-most-expensive-assumptio...), "energy per query" hides the real problem: a trillion dollars of infrastructure built on the bet that they won't.
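For intuition on why "predicts loss, not capability" matters, here's a minimal sketch of a Chinchilla-style scaling law. The constants are roughly the Hoffmann et al. (2022) fits, but treat the whole thing as illustrative, not as anyone's actual production curve:

    # Illustrative Chinchilla-style scaling law: pre-training loss as a
    # function of parameter count N and training tokens D. Constants are
    # approximately the Hoffmann et al. (2022) fits; treat as illustrative.
    def pretraining_loss(n_params: float, n_tokens: float) -> float:
        E, A, B = 1.69, 406.4, 410.7   # irreducible loss + fit coefficients
        alpha, beta = 0.34, 0.28       # diminishing-returns exponents
        return E + A / n_params**alpha + B / n_tokens**beta

    # Each 10x in parameters (with compute-optimal D ~= 20N) buys a smaller
    # loss improvement than the last, and says nothing about downstream tasks.
    for n in (1e9, 1e10, 1e11, 1e12):
        print(f"{n:.0e} params -> loss {pretraining_loss(n, 20 * n):.3f}")

Run it and the loss deltas shrink with each decade of scale; whether downstream capability tracks that curve at all is exactly what the cited work disputes.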
This is what I expect from a mid marketing team... not from a supposed visionary thought leader (/s)...
This is completely fallacious thinking that I assume is meant to manipulate people who don't think deeply about the implications and the procession of ideas, which leads to such obviously disingenuous intellectual dishonesty...
Waste heat... is not the same as a biologically closed loop, in which microbes, bacteria, mycelia, plants, and animals all work in concert...
My food becomes fertilizer... his waste becomes nothing of utility (unless they have amazing efficiencies that defy what we know about physics...)
It also just doesn't make sense. Like, we train a human and that takes 20 years of food.
To train an LLM, you needed a collection of ~825 GiB of data (The Pile). To generate that pile, you needed millions to billions of humans. So did training the LLM suddenly take billions of human-years of food (20 years each, times millions to billions of people), or are we not allowed to make the same shitty comparison?
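If you actually put numbers on that reductio, the point stands. A minimal sketch, where every constant (corpus size, bytes per word, words per person-year, calories) is a loud assumption chosen only for rough magnitude:

    # Crude reductio: person-years of food "embodied" in a Pile-sized corpus.
    # Every constant here is an assumption, chosen only for rough magnitude.
    PILE_BYTES = 825e9             # The Pile is ~825 GiB of text
    BYTES_PER_WORD = 6             # rough average for English
    WORDS_PER_PERSON_YEAR = 1e6    # generous guess for a prolific writer
    KCAL_PER_PERSON_DAY = 2000

    person_years = PILE_BYTES / BYTES_PER_WORD / WORDS_PER_PERSON_YEAR
    joules = person_years * 365 * KCAL_PER_PERSON_DAY * 4184  # 1 kcal = 4184 J
    print(f"{person_years:,.0f} person-years of writing")
    print(f"~{joules / 3.6e12:.0f} GWh of food energy")       # 1 GWh = 3.6e12 J

Even under generous assumptions that's on the order of 100+ GWh of food energy embodied in the corpus, well above the single-digit GWh typically estimated for a training run.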
I couldn't find much on training AI models. Apparently GPT-3 used about 1.3 GWh for training, so maybe ~10 GWh for newer models? (Rough sanity check below.)
So... let's stop training humans, I guess.
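That 1.3 GWh figure roughly checks out from first principles. A back-of-envelope sketch, where the FLOP budget is the commonly cited estimate and the hardware numbers (V100-class peak, utilization, power draw, PUE) are assumptions:

    # Back-of-envelope check on GPT-3's ~1.3 GWh training figure.
    TRAIN_FLOPS = 3.14e23      # commonly cited GPT-3 compute budget
    GPU_FLOPS = 125e12         # assumed V100-class tensor-core peak, FLOP/s
    UTILIZATION = 0.25         # guessed realistic fraction of peak
    GPU_WATTS = 300            # assumed per-GPU draw
    PUE = 1.2                  # assumed datacenter overhead

    gpu_seconds = TRAIN_FLOPS / (GPU_FLOPS * UTILIZATION)
    joules = gpu_seconds * GPU_WATTS * PUE
    print(f"~{joules / 3.6e12:.1f} GWh")   # 1 GWh = 3.6e12 J

That lands within a factor of ~1.5 of the reported figure, and scaling the FLOP budget 10-30x for newer frontier models puts you in the ~10 GWh range guessed above.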