top | item 46511620

croemer | 1 month ago

What's the source for the energy per token? I guess this? https://www.theguardian.com/technology/2025/aug/09/open-ai-c... 18 Wh per 1,000 tokens is at the high end of estimates. But even if it's 10x less, I agree this is pretty crazy usage.


ossa-ma | 1 month ago

"The University of Rhode Island based its report on its estimates that producing a medium-length, 1,000-token GPT-5 response can consume up to 40 watt-hours (Wh) of electricity, with an average just over 18.35 Wh, up from 2.12 Wh for GPT-4. This was higher than all other tested models, except for OpenAI's o3 (25.35 Wh) and Deepseek's R1 (20.90 Wh)."

https://www.tomshardware.com/tech-industry/artificial-intell...

https://app.powerbi.com/view?r=eyJrIjoiZjVmOTI0MmMtY2U2Mi00Z...

https://blog.samaltman.com/the-gentle-singularity
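To put the report's per-response figures on a per-token scale, here is a minimal sketch converting the quoted Wh-per-1,000-token numbers into joules per token (the figures are taken directly from the quoted article; the dictionary and function names are just illustrative):

```python
# Wh consumed per 1,000-token response, as quoted from the
# University of Rhode Island report (via Tom's Hardware).
WH_PER_1000_TOKENS = {
    "GPT-5 (avg)": 18.35,
    "GPT-4": 2.12,
    "OpenAI o3": 25.35,
    "DeepSeek R1": 20.90,
}

def joules_per_token(wh_per_1000: float) -> float:
    # 1 Wh = 3600 J; spread over 1,000 tokens.
    return wh_per_1000 * 3600 / 1000

for model, wh in WH_PER_1000_TOKENS.items():
    print(f"{model}: {joules_per_token(wh):.2f} J/token")
```

On these numbers, GPT-5 averages about 66 J per token versus roughly 7.6 J for GPT-4, an increase of nearly 9x.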

causal | 1 month ago

These numbers don't pass a sanity check for me. With four 300 W cards you can get a 1K-token DeepSeek R1 output in about 10 seconds. That's just 3.3 Wh, right? And that's before you even consider batching.
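The comment's arithmetic checks out; a quick sketch of the calculation (the 4x300 W draw and 10-second generation time are the commenter's own assumptions, not measured figures):

```python
# Back-of-the-envelope check: four 300 W GPUs running for
# 10 seconds to produce a ~1,000-token output.
POWER_W = 4 * 300        # assumed total draw in watts
SECONDS = 10             # assumed generation time

energy_j = POWER_W * SECONDS   # 12,000 joules
energy_wh = energy_j / 3600    # convert J -> Wh

print(f"{energy_wh:.2f} Wh per 1,000 tokens")  # ≈ 3.33 Wh
```

That 3.3 Wh is for a whole node dedicated to one request; with batching, the per-request share would be lower still, which is why the 20.90 Wh figure quoted above for R1 looks high by comparison.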