
Lio | 17 days ago

I think there's room for multiple approaches here.

Cloud-based AI obviously has a lot of advantages, e.g. batched processing on the best hardware, low-power edge devices, data sharing, etc.

There's still room for local inference, though. I don't know that I want "more context on me" all the time. I want some context, some of the time, and I want to be in full control of it.

I'd pay for that. I don't think it will be for everyone, but a number of people would pay a premium for an off-the-shelf product that provides the privacy and control that cloud vendors, by their nature, just can't offer.

alex43578 | 17 days ago

Definitely room for multiple approaches, including local LLMs.

But I just don't think that, for most users, local LLM capabilities will be a deciding factor in either hardware or OS choices.

A cloud subscription model will be the premium offering ($20/month for consumers, $100 to $1,000/month or pay-per-token for businesses), and inevitably there will be something ad-supported at a lower price, or free, for low-end consumers.

Once Joe Consumer has access to that ChatGPT subscription or free tier, are they really going to run a far less powerful model on their laptop? Outside of a few simple tasks, like semantic search over your email, notes, and photos, or local transcription, local models will just be too far behind the curve for the public to make much use of them.
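
For what it's worth, the "semantic search over your notes" case really is easy to do fully on-device today. A rough sketch of what that looks like (the sentence-transformers package and the all-MiniLM-L6-v2 model are my picks for illustration, not something anyone upthread named):

    # Rough sketch of on-device semantic search, the kind of
    # "simple task" mentioned above.
    # Assumes: pip install sentence-transformers
    from sentence_transformers import SentenceTransformer, util

    # Small (~80 MB) embedding model; downloaded once, then
    # cached and run locally on a laptop CPU.
    model = SentenceTransformer("all-MiniLM-L6-v2")

    notes = [
        "Dentist appointment moved to Thursday at 3pm",
        "Flight confirmation: SFO to JFK, seat 14C",
        "Ideas for the Q3 planning offsite",
    ]

    # Embed the corpus and the query entirely on-device;
    # nothing leaves the machine.
    note_emb = model.encode(notes, convert_to_tensor=True)
    query_emb = model.encode("when is my dental visit?",
                             convert_to_tensor=True)

    # Rank notes by cosine similarity, no cloud API involved.
    scores = util.cos_sim(query_emb, note_emb)[0]
    best = int(scores.argmax())
    print(notes[best], float(scores[best]))

Retrieval with a small embedding model is one of the few places where "far less powerful" barely hurts, which is exactly why it ends up on the short list of tasks local models can plausibly own.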