top | item 44653535


greyadept|7 months ago

I would really like it if an LLM tool would show me the power consumption and environmental impact of each request I’ve submitted.


preciz|7 months ago

And each toilet flush you make should also have a CO2 calculation, which should count against your daily carbon allowance.

evrimoztamur|7 months ago

Spending drinking water on toilet flushes is indeed a problem. Perhaps not CO2 measurements directly, but informing people of how much high-quality water is wasted on flushes alone would hopefully build momentum for more efficient flushing mechanisms and for introducing grey-water systems into new and old buildings alike. Good idea!

j-pb|7 months ago

People downvote your sarcasm, but if you do the calculations, you're kinda right.

1 kg of beef costs:

  - The energy equivalent of 60,000 ChatGPT queries.
  - The water equivalent of 50,000,000 ChatGPT queries.

Applied to that metric, Mistral Large 2 used:

  - The water equivalent of 18.8 tons of beef.
  - The CO2 equivalent of 204 tons of beef.

France produces 3,836 tons of beef per day, and one large LLM per six months.

So yeah, maybe use ChatGPT to ask for vegan recipes.
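To put those figures in perspective, here is a quick sanity check of the comparison. It only rearranges the numbers given in the comment (which are themselves unverified estimates), asking: how long does France's beef industry take to match one large model's footprint?

```python
# Back-of-the-envelope check of the beef-vs-LLM comparison above.
# All input figures come from the comment, not independent measurement.

FRANCE_BEEF_TONS_PER_DAY = 3836   # daily French beef output (per the comment)
LLM_WATER_AS_BEEF_TONS = 18.8     # Mistral Large 2 water use, in beef-equivalents
LLM_CO2_AS_BEEF_TONS = 204        # Mistral Large 2 CO2, in beef-equivalents

# Fraction of a day's beef production matching the model's footprint.
water_days = LLM_WATER_AS_BEEF_TONS / FRANCE_BEEF_TONS_PER_DAY
co2_days = LLM_CO2_AS_BEEF_TONS / FRANCE_BEEF_TONS_PER_DAY

print(f"Water footprint ≈ {water_days * 24 * 60:.0f} minutes of French beef production")
print(f"CO2 footprint   ≈ {co2_days * 24:.1f} hours of French beef production")
```

By these numbers, the model's entire water footprint equals about seven minutes of French beef production, and its CO2 footprint about an hour and a quarter, which is the comment's point in miniature.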

People will try to blame everything else they can get hold of before changing the stuff that really has an impact, if it means touching their lifestyle.

The LLMs are not the problem here.

stonogo|7 months ago

Toilets are already labeled with their usage rate.

jrflowers|7 months ago

This is a good point because being curious about energy usage is the same thing as advocating for an imaginary rule about energy usage

kingstnap|7 months ago

You can assume the API price is roughly proportional to electricity usage.

If you buy $10 in tokens, that probably folds into ~$3 to $5 in electricity.

Which would be around 30 to 90 kWh of electricity.

Depending on the source, carbon intensity could be anywhere from ~500 g/kWh (for natural gas) down to ~24 g/kWh (for hydroelectric).

It's a really wide spread, but I'd say for $10 in tokens, you'd probably be in the neighbourhood of 1 kg to 40 kg of emissions.

The good thing is that a lot of the spread comes from the electricity source. So if we can get all of these datacenters onto clean energy sources, it could change emissions by over an order of magnitude compared to gas turbines (like xAI uses).
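The spread can be sketched directly from the figures above. The 30-90 kWh range and the per-source carbon intensities are the comment's assumptions, not measurements:

```python
# Emissions range for $10 of tokens, under the comment's assumptions:
# $10 of tokens buys roughly 30-90 kWh of electricity, and carbon
# intensity ranges from ~24 g/kWh (hydro) to ~500 g/kWh (natural gas).

kwh_low, kwh_high = 30, 90                    # electricity behind $10 of tokens
g_per_kwh = {"hydro": 24, "natural gas": 500}  # grams of CO2 per kWh

for source, intensity in g_per_kwh.items():
    low_kg = kwh_low * intensity / 1000
    high_kg = kwh_high * intensity / 1000
    print(f"{source}: {low_kg:.1f}-{high_kg:.1f} kg CO2 per $10 of tokens")
```

The best and worst cases differ by a factor of roughly 60, which is where the "over an order of magnitude" claim comes from: the grid mix dominates the estimate, not the model.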

chessgecko|7 months ago

If you bought an Nvidia H100 at wholesale prices (around $25k) and ran it 24/7 at commercial electric rates (let's say $0.10 per kWh), it would take you over 40 years to spend the purchase price of the GPU on electricity. Maybe bump it down to 20 to account for data center cooling.

I don't think the cost of AI is close to converging to the price of power yet. Right now it's mostly the price of hardware and data center space, minus subsidies.
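The payback arithmetic checks out under reasonable assumptions. The comment doesn't state a wattage; 700 W (the H100 SXM TDP) is assumed here:

```python
# Checking the H100 electricity-payback claim above. The $25k price and
# $0.10/kWh rate are from the comment; the 700 W draw is an assumption
# (H100 SXM TDP).

gpu_price_usd = 25_000
power_kw = 0.7                # assumed ~700 W continuous draw
rate_usd_per_kwh = 0.10

daily_cost = power_kw * 24 * rate_usd_per_kwh   # dollars of electricity per day
years_to_match = gpu_price_usd / daily_cost / 365

print(f"~{years_to_match:.0f} years of 24/7 electricity to equal the purchase price")
# Doubling the power draw for cooling overhead halves this to roughly 20 years.
```

So at these rates, hardware depreciation dwarfs the power bill, which supports the comment's point that token prices track hardware cost more than electricity cost.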

dijit|7 months ago

I don’t think that you can make this assumption.

People are selling AI at a loss right now.

jeffbee|7 months ago

That would be ... thousands of times less useful than giving you the same information at the motor fuel pump. Unfortunately this isn't one of those situations where every little bit counts. There are 2 or 3 things you can do to reduce your environmental impact, and not using chatbots isn't one of them.

jiehong|7 months ago

Let’s call it GreenOps