top | item 44801819

acc_297 | 6 months ago

Yes, but the task becomes that much harder: we are scaling up natural gas generation not to phase out coal but simply to meet demand that wouldn't exist without the fierce competition to build the biggest LLM. Any feasible plan made five years ago that may have worked to transition a large industry from fuel-burning energy sources to electricity generation (renewable or otherwise) is made 10x harder by this rapid rollout of datacentre capacity.

jeffbee | 6 months ago

I'm not sure if that's true or not. As the article indicates, training for AI is naturally demand responsive: training can shift around the clock, and it can move around the world to minimize carbon footprint. See the PDF I just love to share: "Revamped Happy CO2e Paper" https://arxiv.org/pdf/2204.05149
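To make "demand responsive" concrete: a deferrable training job can be placed in whichever region and hour has the lowest forecast grid carbon intensity. This is only a minimal sketch of that idea; the region names and gCO2e/kWh figures below are made-up illustration, not real data or any provider's API.

```python
# Hypothetical carbon-aware placement: pick the cleanest (region, hour)
# slot from a forecast of grid carbon intensity. All numbers are invented.

forecasts = {
    # region: forecast carbon intensity (gCO2e/kWh) for the next 4 hours
    "us-central": [450, 430, 390, 410],
    "europe-north": [120, 110, 150, 140],
    "asia-east": [520, 500, 480, 470],
}

def cleanest_slot(forecasts):
    """Return the (region, hour, intensity) tuple with the lowest intensity."""
    return min(
        (
            (region, hour, intensity)
            for region, hours in forecasts.items()
            for hour, intensity in enumerate(hours)
        ),
        key=lambda slot: slot[2],
    )

region, hour, intensity = cleanest_slot(forecasts)
print(f"Schedule training in {region} at hour {hour} ({intensity} gCO2e/kWh)")
```

The point of the sketch is that training, unlike interactive inference, tolerates this kind of deferral: the scheduler is free to wait for a cleaner hour or ship the job to a cleaner grid.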

acc_297 | 6 months ago

Agree on training. But that Google paper was written when the only image model available for broad public consumption was DALL-E 2 and video models were more than a year away. It gets a mention in a more recent 2024 paper [1], which goes into detail about how inference, rather than training, creates the difficult-to-manage energy load that grids struggle to meet. If consumer interests and demands drive what companies offer in terms of inference capability, then it's fair to worry that the impact on sustainability goals will be an afterthought.

[1] https://dl.acm.org/doi/pdf/10.1145/3630106.3658542