rubyn00bie | 17 days ago

I don’t see how AI won’t end up running on personal devices. It’s like how mainframes were the original computing platform and then we had the PC revolution. If anything, I think Apple is uniquely positioned to pull the rug out from under a lot of these cloud models. It might take 10 or 15 years, but eventually we’ll see an arms race to do so. There’s too much money on the table, and once cloud providers are tapped out, the next logical step is home users. It also makes scaling a lot easier, because you don’t need increasingly expensive, complex, and power-hungry data centers.

It wasn’t that long ago (ignoring the current DRAM market shenanigans) that it was unthinkable to have a single machine with over a terabyte of RAM and 192 physical cores. Now that’s absolutely doable in a single workstation. Heck, even my comparatively paltry 96GB of RAM would’ve been absurd in 2010; now there are single prosumer GPUs with that much.

alex43578 | 17 days ago

With the rate of progress (and, in the opposite direction, the physical limitations Intel/AMD/TSMC/etc. are bumping into), there are no guarantees about what a machine will look like a decade from now. But simple logic applies: if the user's machine scales to X amount of RAM, the hyperscaler's rack scales to X*Y RAM, and assuming the performance/scaling relationship we've seen holds true, it will be correspondingly far smarter/better/more powerful than the user's AI.

Maybe that won't matter when the user is asking it a 5th-grade question, but for any application of AI more complex than "what's the weather" or "turn on a light," users should want a better AI, particularly if they don't have to pay for all that silicon sitting around unused in their machine for most of the day.

einr | 17 days ago

This argument would sound nearly identical if you made it in the 70s or early 80s about mainframes and personal computers.

It's not that mainframes (or supercomputers, or servers, or the cloud) stopped existing; it's that the personal computer reached a "good enough" point where it was powerful enough to do all the things that people care about. Why would this be different?*

And aren't we all paying for a bunch of silicon that sits mostly unused? I have a full modern GPU in my Apple SoC capable of throwing a ridiculous number of polygons per second at the screen and I'm using it to display two terminal emulator windows.

* (I can think of a number of reasons why it would in fact turn out differently, but none of them have to do with the limits of technology; they are all about control or economic incentives)