top | item 45466755

idkwhattocallme | 5 months ago

The bubble bursts when Apple announces it has good-enough (private, secure) LLMs running on device. At that point the capex on cloud infrastructure starts to come into question and the dominoes start to fall...

fao_ | 5 months ago

Google's been doing this since at least 2022 and... well, nobody really cares.

goalieca | 5 months ago

LLMs, to me, are the least interesting part of AI. Deep learning has proven very useful for signal processing and image segmentation, among other things, and those models are small enough to run on phones. LLMs simply don’t seem to be that useful at small scales, because the illusion of knowledge falls apart with too few parameters.

kelipso | 5 months ago

Yeah, it’s only a matter of time until LLMs can easily be run locally by more people, and once the market realizes that, it’s over.

gtirloni | 4 months ago

You can't run anything with close to ChatGPT's parameter count locally, though.

mscbuck | 5 months ago

As laughable as Apple's efforts have been so far, I think they still have an advantage, precisely because of their unified memory architecture.