item 46211733

billisonline | 2 months ago

"Tons of power generation?" Perhaps we will go in that direction (as OpenAI projects), but it assumes the juice will be worth the squeeze, i.e., that scaling laws requiring much more power for LLM training and/or inference will deliver a qualitatively better product before they run out. The failure of GPT 4.5, while not a definitive end to scaling, was a pretty discouraging sign.
