item 47209969

coldtea | 13 hours ago

If anything, memory ain't getting cheaper, disks aren't either, and as for graphics cards, forget it.

People won't be competing with even a current 2026 SOTA from their home LLM setup anytime soon. Even the actual SOTA LLM providers aren't competitive on cost - they're losing money on energy and compute, hoping to make it up on market capture and win the IPO races.

OtherShrezzing|12 hours ago

I don’t think anyone needs to compete with the LLM SOTA to get the benefits of these technologies on-device.

Consumers don’t need a 100k context window oracle that knows everything about both T-Cells and the ancient Welsh Royal lineage. We need focused & small models which are specialised, and then we need a good query router.
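The "query router" idea above can be sketched as a dispatcher that sends each query to a small specialised model rather than one giant generalist. This is a minimal illustration, not anything the commenter specified: the specialist names, keyword lists, and scoring heuristic are all made up, and a real router would more likely use an embedding classifier than keyword matching.

```python
# Hypothetical query router: pick the specialist whose keyword list
# best matches the query; fall back to a general model on no match.
SPECIALISTS = {
    "code": ["python", "compile", "bug", "function"],
    "medicine": ["t-cell", "immune", "dose"],
    "history": ["lineage", "dynasty", "royal"],
}

def route(query: str) -> str:
    """Return the name of the specialist model to handle this query."""
    q = query.lower()
    scores = {name: sum(kw in q for kw in kws)
              for name, kws in SPECIALISTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "general"

print(route("What do T-cells do in the immune system?"))  # medicine
print(route("Tell me a joke"))                            # general
```

A production router would also need confidence thresholds and a fallback path, but the shape is the same: cheap classification up front, expensive inference only where it pays off.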

ezst|7 hours ago

We need them for what? Specialized models seem to provide value comparable to what we've been doing with machine learning for eons, just less efficient to train and to run.