top | item 44149058

sspiff | 9 months ago

Many of them try to be mindful of their climate impact.

I've tried to explain it in those terms as well: every medium-sized prompt on these large models consumes roughly one phone battery charge's worth of energy. You have a phone with a calculator.

I'd ask them to do the math on how much energy they're wasting asking stupid things of these systems, but I'm too afraid they'd ask ChatGPT to do the math.

alainx277 | 9 months ago

Where are you getting these numbers from? I'm finding data between 0.3 Wh and 3 Wh for a 4o query, while a typical phone battery holds about 10 Wh.

The energy consumption per prompt will also decrease for the same task complexity as inference hardware improves (as it already has since the 2023 paper that reported 3 Wh).
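A quick back-of-envelope check of these figures (using the estimates quoted in this thread, not measured data) shows how far off the "one battery charge per prompt" framing is:

```python
# All numbers are the commenter's estimates, not measurements.
battery_wh = 10.0           # typical phone battery capacity cited above
per_query_wh = (0.3, 3.0)   # reported range for a single 4o query

for q in per_query_wh:
    queries = battery_wh / q
    print(f"At {q} Wh/query: ~{queries:.0f} queries per full phone charge")
```

Even at the pessimistic 3 Wh figure, one phone charge covers several queries rather than the other way around.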