
Arbitraging Down LLM Inference to the Cost of Electricity

6 points | ycombyourhair | 6 months ago | inference.net

4 comments

[+] srbhr | 6 months ago | reply
> Someone with cheap electricity can serve inference profitably at prices that would bankrupt the current small set of centralized providers.

Strong claim. There's a similar article on HN about China eating the world. Maybe, then, they're the ones who can take over the inference cloud?
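
For what it's worth, a rough back-of-envelope on the quoted claim (every number below is an assumption for illustration, not from the article):

    # Back-of-envelope: electricity cost per million tokens served.
    # All figures are assumed for illustration, not measurements.
    gpu_power_kw = 0.7          # assumed sustained draw of one GPU, kW
    throughput_tok_s = 1000     # assumed decode throughput, tokens/s
    price_per_kwh = 0.05        # assumed cheap electricity, $/kWh

    seconds_per_mtok = 1e6 / throughput_tok_s
    kwh_per_mtok = gpu_power_kw * seconds_per_mtok / 3600
    cost_per_mtok = kwh_per_mtok * price_per_kwh
    print(f"~${cost_per_mtok:.3f} per million tokens")  # ~$0.010

Under those assumptions, electricity alone works out to about a cent per million tokens, which is far below current API prices, though it ignores hardware depreciation, utilization, and bandwidth.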

[+] skeptrune | 6 months ago | reply
Why is there nowhere on the site where I can figure out how much I would get paid by adding my GPU to the network?
[+] srbhr | 6 months ago | reply
That's exactly what I was thinking.
[+] nick779 | 6 months ago | reply
Frontier labs are still making crazy margins. OSS models are impossible to host serverlessly at a profit, even if you're Together.