Mistral reports on the environmental impact of LLMs

69 points | Kydlaw | 7 months ago | mistral.ai

57 comments

jeffbee|7 months ago

These conclusions are broadly compatible with "The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink" or, as I prefer, the PDF metadata title that they left in there, "Revamped Happy CO2e Paper".

Despite the incredible focus by the press on this topic, Mistral's lifecycle emissions in 18 months were less than the typical annual emissions of a single A320neo in commercial service.

https://arxiv.org/pdf/2204.05149

ACCount36|7 months ago

The press focus is a mix of the usual "new thing BAD", and the much more insidious PR work by fossil fuel megacorps.

Fossil fuel companies are damn good at PR, and they know well that they simply can't make themselves look good. The next best thing? Make someone else look worse.

If an Average Joe hears "a company that hurts the environment" and thinks OpenAI and not British Petroleum, that's a PR win.

greyadept|7 months ago

I would really like it if an LLM tool would show me the power consumption and environmental impact of each request I’ve submitted.

preciz|7 months ago

And each toilet flush you make should also have a CO2 calculation, which should count against your daily carbon allowance.

kingstnap|7 months ago

You can assume the API price is roughly proportional to electricity usage.

If you buy $10 in tokens, roughly $3 to $5 of that probably goes to electricity.

That works out to around 30 to 90 kWh of electricity.

Depending on the source, the carbon intensity could be anywhere from ~500 g/kWh for natural gas down to ~24 g/kWh for hydroelectric.

It's a really wide spread, but I'd say for $10 in tokens, you'd probably be in the neighbourhood of 1 kg to 40 kg of CO2e emissions.

The good thing is that most of the spread comes from the electricity source. So if we can get all of these datacenters onto clean energy, emissions could drop by over an order of magnitude compared to gas turbines (like xAI uses).
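The napkin math above can be sketched in a few lines. Every number here is the commenter's assumption (the kWh-per-$10 range and the typical grid intensities), not a measured figure:

```python
# Rough CO2e range for an API token spend, using the assumed figures above.

KWH_RANGE = (30, 90)        # assumed kWh of electricity per $10 of tokens
G_CO2_PER_KWH = {           # typical published grid intensities, g CO2e/kWh
    "hydro": 24,
    "natural_gas": 500,
}

def emissions_range_kg(spend_usd: float) -> tuple[float, float]:
    """Best/worst-case kg CO2e for a token spend, scaled from the $10 figures."""
    scale = spend_usd / 10
    low = scale * KWH_RANGE[0] * G_CO2_PER_KWH["hydro"] / 1000
    high = scale * KWH_RANGE[1] * G_CO2_PER_KWH["natural_gas"] / 1000
    return low, high

low, high = emissions_range_kg(10)
print(f"$10 of tokens: {low:.2f} to {high:.0f} kg CO2e")
```

The best case (low kWh on hydro) and worst case (high kWh on gas) land at roughly 0.7 to 45 kg, in the same neighbourhood as the 1 to 40 kg estimate.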

jeffbee|7 months ago

That would be ... thousands of times less useful than giving you the same information at the motor fuel pump. Unfortunately this isn't one of those situations where every little bit counts. There are 2 or 3 things you can do to reduce your environmental impact, and not using chatbots isn't one of them.

jiehong|7 months ago

Let’s call it GreenOps

jiehong|7 months ago

So, using the smallest model for the task would help, as expected.

A very small model could run on device to automatically switch and choose the right model based on the request. It would certainly help navigate each vendor's confusing model naming.

potatolicious|7 months ago

> "A very small model could run on device to automatically switch and choose the right model based on the request."

This is harder than it looks. A “router” model often has to be quite large to maintain routing accuracy, especially if you’re trying to understand regular user requests.

Small on-device models gating more powerful models most likely just leads to mis-routes.

evrimoztamur|7 months ago

What is the levelised cost per token? As in how we calculate levelised cost of energy.

If we take the total training footprint and divide that by the number of tokens the model is expected to produce over its lifetime, how does that compare to the marginal operational footprint?

My napkin math says per-token water and material footprints come out 6-600% and 4-400% higher respectively, for lifetime token counts on the order of 40B down to 400M.

I don't have a good baseline on how many tokens Mistral Large 2 will infer over the course of its lifetime, however. Any ideas?
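The levelised comparison reduces to one formula. The inputs below are placeholders chosen only to reproduce the 6%-600% spread; Mistral's actual training footprint and per-token figures aren't public:

```python
# Amortize a one-off training footprint over lifetime tokens and express it
# as a percentage of the marginal (inference-only) per-token footprint.
# Units cancel, so the same formula works for CO2e, water, or materials.

def levelised_overhead_pct(training_total: float,
                           lifetime_tokens: float,
                           marginal_per_token: float) -> float:
    amortized = training_total / lifetime_tokens
    return 100 * amortized / marginal_per_token

# Placeholder inputs: a 100x spread in lifetime tokens (400M vs 40B)
# yields a 100x spread in overhead, matching the ~6% to ~600% range.
print(levelised_overhead_pct(2.4, 40e9, 1e-9))    # ~6%
print(levelised_overhead_pct(2.4, 400e6, 1e-9))   # ~600%
```

The key sensitivity is lifetime tokens: since the training footprint is fixed, the amortized overhead scales inversely with how much inference the model actually serves.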

kurthr|7 months ago

Within marginal error, dollars=destruction.

Even if the company is "green" they make money, they pay employees/stockholders, those people use the money to buy more things and go on vacations in airplanes. Worse, they invest the money to make more money and consume more goods.

Even your grains and vegetables are shipped in to feed you, if you walk to the grocery store. You pay rent/mortgage for a house built with concrete and steel. The highest priced items you pay for are also likely the most energy and environmentally costly. They create GDP.

It's a little weird with LLMs right now, because everything is subsidized by VC, Ads, BigCo investment so you can't see real costs. They're probably higher than the $30-200/mo you pay, but they're not 10x the price like your rent, car payment, food, vacation, investment/pension are.

wmf|7 months ago

It's sad to see the French of all people fall for guilt-trip austerity thinking. Just decarbonize the grid and move on. Energy is good.

eric-burel|7 months ago

It's nice to see the French engineers collecting quantitative data to back future decisions thanks to our solid environment agency (the Ademe) work and public-private cooperation.

_squared_|7 months ago

France already has a relatively low-carbon grid, but that doesn't fix the water consumption issue, which is getting more problematic as climate change causes droughts all over the country.

djoldman|7 months ago

They report that the emissions of 400 output tokens, "one page of text," equates to 10 seconds of online video streaming in the USA.

So I guess one saves a lot of emissions if one stops tiktok-ing, hulu-ing, instagram reel-ing, etc.
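That equivalence works as a trivial converter; the 400-tokens-per-10-seconds ratio is the report's figure as quoted above, and the 2,000-token example is just an illustration:

```python
# Convert generated output tokens into the report's "seconds of US online
# video streaming" equivalent: 400 output tokens ~ 10 seconds of streaming.

STREAM_SECONDS_PER_TOKEN = 10 / 400

def tokens_to_streaming_seconds(tokens: float) -> float:
    return tokens * STREAM_SECONDS_PER_TOKEN

# A 2,000-token chat reply is roughly 50 seconds of streaming.
print(tokens_to_streaming_seconds(2000))
```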

dr_kretyn|7 months ago

This is a fantastic report. As someone tasked with getting the most out of AI at our company, I frequently get questions about its environmental impact. Great to have a reference.

austinjp|7 months ago

This is interesting but I'd love it if they'd split training and inference. Training might be highly expensive and conducted once, while inference might be less expensive but conducted many, many times.