top | item 45486970

a_wild_dandan | 4 months ago

I would bet that it's far lower now. Inference is expensive, but we've made extraordinary efficiency gains through techniques like distillation. That said, GPT-5 is a reasoning model, and those are notorious for high token burn. So who knows, it could be a wash. But the selective pressure to optimize for scale/growth/revenue/independence from MSFT/etc. makes me think that OpenAI is chasing those watt-hours pretty doggedly. So 0.34 is probably high...

...but then Sora came out.

yen223|4 months ago

Yeah, two things we can be fairly confident about:

a) training is where the bulk of an AI system's energy usage goes (based on a report released by Mistral)

b) video generation is very likely a few orders of magnitude more expensive than text generation.

That said, I still believe that data centres in general - including AI ones - don't consume a significant amount of energy compared with everything else we do, especially heating and cooling and transport.

Pre-LLM data centres consumed about 1% of the world's electricity; AI data centres may bump that up to 2%.
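As a back-of-envelope sanity check on why inference per se barely moves those percentages: a rough sketch only. The ~27,000 TWh/yr world-electricity figure and the 1 billion queries/day volume are my own assumptions; 0.34 Wh/query is the figure quoted upthread.

```python
# Rough sketch, not measured data. Assumptions:
#   WORLD_TWH_PER_YEAR - approximate global electricity consumption (assumed)
#   WH_PER_QUERY       - the 0.34 Wh/query figure quoted upthread
#   QUERIES_PER_DAY    - hypothetical query volume

WORLD_TWH_PER_YEAR = 27_000
WH_PER_QUERY = 0.34
QUERIES_PER_DAY = 1_000_000_000

# Annual inference energy at that volume, converted from Wh to TWh.
query_twh_per_year = WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1e12

# Fraction of assumed world electricity consumption.
share = query_twh_per_year / WORLD_TWH_PER_YEAR

print(f"inference energy: {query_twh_per_year:.3f} TWh/yr")
print(f"share of world electricity: {share:.6%}")
```

Even at a billion queries a day, per-query inference comes out around 0.12 TWh/yr, a tiny fraction of a percent of world electricity, which is consistent with the point above that training (and video generation) is where the serious energy goes.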

bluefirebrand|4 months ago

> don't consume a significant amount of energy compared with everything else we do, especially heating and cooling and transport

Ok, but heating and cooling are largely not negotiable. We need those technologies to make places liveable

LLMs are not remotely as crucial to our lives

blondie9x|4 months ago

You gotta start thinking about the energy used to mine and refine the raw materials used to make the chips and GPUs. Then take into account the infrastructure and data centers.

The amount of energy is insane.