Does anyone else have a hard time accepting these calculations? I don't doubt that AI has serious environmental costs, but some of the claims in this infographic seem far-fetched. Per-query inference costs should be far lower than training costs. And if a single 100-word email with GPT-4 really requires 0.14 kWh of energy, then AI power users and developers must be consuming 100x as much. Also, what about running models like Llama-3 locally? I'd love to see someone with more expertise either debunk or confirm the troubling claims in this article. It feels like someone accidentally shifted a decimal point a few places to the right.
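A rough sanity check of the skepticism above, as a sketch. All inputs are assumptions, not measurements: I assume an H100-class accelerator draws roughly 700 W at full load and that generating a ~100-word email takes on the order of a few seconds of GPU time; neither figure comes from the article.

```python
# Back-of-envelope check of the 0.14 kWh-per-email claim.
# Both inputs below are assumed, illustrative figures.
GPU_POWER_W = 700        # assumed full-load draw of one H100-class GPU
GENERATION_TIME_S = 3    # assumed wall-clock time to generate ~100 words
CLAIMED_KWH = 0.14       # the figure questioned above

estimated_wh = GPU_POWER_W * GENERATION_TIME_S / 3600  # energy = power x time
claimed_wh = CLAIMED_KWH * 1000

print(f"rough estimate: {estimated_wh:.2f} Wh per email")
print(f"article's claim: {claimed_wh:.0f} Wh per email")
print(f"claim is ~{claimed_wh / estimated_wh:.0f}x the rough estimate")
```

Under these assumptions the claim comes out two orders of magnitude too high, which is consistent with the shifted-decimal-point suspicion; different hardware or batching assumptions would move the ratio, but not by 100x.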
marginalia_nu|1 year ago
The article's numbers line up better with roughly 1 s of CPU inference.
gcr|1 year ago
Edit: Nah, I'm convinced; look at table 1. Inference costs are around 20 mL of water in a datacenter environment.
FrojoS|1 year ago
(I can't access the article.)