I don’t see how AI won’t end up running on personal devices. It’s like how mainframes were the original computing platform and then we had the PC revolution. If anything, I think Apple is uniquely positioned to pull the rug out from under a lot of these cloud models. It might take ten or fifteen years, but eventually we’ll see an arms race to do so. There’s too much money on the table, and once cloud customers are tapped out, the next logical step is home users. It also makes scaling a lot easier because you don’t need increasingly expensive, complex, and power-hungry data centers.

It wasn’t that long ago (ignoring the current DRAM market shenanigans) that it was unthinkable to have a single machine with over a terabyte of RAM and 192 physical cores. Now that’s absolutely doable in a single workstation. Heck, even my comparatively paltry 96GB of RAM would’ve been absurd in 2010; now there are single prosumer GPUs with that much.
alex43578|17 days ago
Maybe that won't matter when the user is asking it a 5th-grade question, but for any application of AI more complex than "what's the weather" or "turn on a light", shouldn't users want a better AI, particularly if they don't have to pay for all that silicon sitting around unused in their machine for most of the day?
einr|17 days ago
It's not that mainframes (or supercomputers, or servers, or the cloud) stopped existing, it's that there was a "good enough" point where the personal computer was powerful enough to do all the things that people care about. Why would this be different?*
And aren't we all paying for a bunch of silicon that sits mostly unused? I have a full modern GPU in my Apple SoC capable of throwing a ridiculous number of polygons per second at the screen and I'm using it to display two terminal emulator windows.
* (I can think of a number of reasons why it would in fact turn out differently, but none of them have to do with the limits of technology -- they are all about control or economic incentives.)