I've been wondering about this failed Apple Intelligence project, but the more I think about it, the more I think Apple can afford to sit and wait. In 5 years we're going to have Opus 4.6-level performance on-device, and Apple is the only company that stands to benefit from it. Nobody wants to be sending EVERY request to someone else's cloud server.
I think there are a lot of false assumptions in that assertion:
- that a bunch of users won't jump ship if Apple stagnates for 5 years
- that a product based on a model with Q1 2026 SoTA performance would be competitive with products using 2031's models.
- that just having access to good (by 2025/2026 standards) models is the big thing that Apple needs in order for Apple Intelligence to finally be useful.
On that last point, I think the OS/app-level features are almost more important than the model itself. If the model can't _do_ anything, it doesn't really matter how intelligent it is. If Apple rests on its laurels for 5 years, will its OS, built-in apps, and 3rd-party apps have all the hooks needed for a useful AI product?
Have you tried running a reasonably sized model locally? You need a minimum of 24GB of VRAM just to load the model, 32GB to be safe. And that isn't even frontier-level, that's the bare minimum.
A good analogy is streaming. To get good quality, sure, you can store the video file, but it's going to take up space. For videos, that's 2-4GB (let's say), and streaming will always be easier and better.
For models, we're looking at hundreds of GB worth of model params. There's no way we can shrink that to, say, 1GB without a loss in quality.
So nope, beyond minimal classification and such, on-device isn't happening.
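The VRAM numbers above fall out of simple arithmetic: weight memory is roughly parameter count times bytes per parameter. A minimal sketch of that estimate (my assumptions, not the commenter's: weights dominate, and KV-cache/activation overhead is ignored):

```python
# Back-of-the-envelope VRAM estimate for running a model locally.
# Assumption: memory is dominated by the weights themselves.

def weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Memory needed just to hold the weights, in GB."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# A 13B model in fp16 (2 bytes/param) already needs ~26 GB:
print(weight_memory_gb(13, 2.0))   # 26.0
# 4-bit quantization (0.5 bytes/param) shrinks it to ~6.5 GB,
# but quality degrades as precision drops:
print(weight_memory_gb(13, 0.5))   # 6.5
```

This is also why "make it 1GB" fails: even at 4 bits, 1GB only fits about a 2B-parameter model.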
--
EDIT:
> Nobody wants to be sending EVERY request to someone else's cloud server.
We do this already with streaming. You watch YouTube, which hosts videos in the "cloud". For the latest MKBHD video, I don't care about having it locally (for the most part). I just wanna watch the video and be done with it.
Same with LLMs. If LLMs are here to stay, most people will wanna use the latest and greatest models.
---
EDIT-EDIT:
If your response is "Apple will figure it out somehow": nope. Apple is sitting out the AI race, so it has no technology. It has nothing. It has access to whatever open source is available, or whatever it can license from the rest. So nope, Apple isn't pushing the limits. They are watching the world move beyond them.
Assuming the rate of progress on AI stays the same:
1/ No, you don't get Opus 4.6-level performance on devices with 12GB of RAM; 7B quantised models just don't get that good. Still quite good, mind you. And I believe the biggest advance to come from mobile AI will be apps providing tools and the device providing a discovery service (see Android's AppFunctions, if it was ever documented well): output quality doesn't matter much on device, but really efficient, good tool calling is a game changer.
2/ Opus 4.6 is now Opus 4.6+5 years and has new capabilities that make people want to keep sending everything to someone else's cloud server instead of burning their battery life.
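The "apps provide tools, device provides discovery" idea in point 1/ can be sketched roughly like this. Everything here is hypothetical illustration (the `ToolRegistry` name and shape are mine, not Android's actual AppFunctions API): the point is that a small on-device model only has to emit a well-formed tool call, not generate high-quality prose.

```python
# Hypothetical sketch of an OS-level tool discovery/dispatch service.
# Apps register callable tools; a small local model emits JSON calls.

import json
from typing import Callable

class ToolRegistry:
    """OS-level discovery service: apps register tools by name."""
    def __init__(self) -> None:
        self._tools: dict[str, Callable[..., str]] = {}

    def register(self, name: str, fn: Callable[..., str]) -> None:
        self._tools[name] = fn

    def dispatch(self, call_json: str) -> str:
        """Execute a tool call emitted by the on-device model."""
        call = json.loads(call_json)
        return self._tools[call["tool"]](**call["args"])

registry = ToolRegistry()
registry.register("calendar.add", lambda title, time: f"Added '{title}' at {time}")

# Even a modest 7B model can reliably produce structured calls like this:
result = registry.dispatch(
    '{"tool": "calendar.add", "args": {"title": "Dentist", "time": "15:00"}}'
)
print(result)  # Added 'Dentist' at 15:00
```

The heavy lifting (doing the thing) lives in the app; the model just routes.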
The rest of FAANG has invested very heavily in cloud while Apple seems to be a laggard. GCP, AWS, and Azure are all publicly available products, and the internal cloud platforms at Netflix and Meta seem very mature for private offerings.
This is not a huge disadvantage in my opinion. Let the rest of big tech fight each other to death over cloud, while controlling a very profitable differentiated offering (devices+services). Apple keeps the M series HW out of data centers, even though it presents some very attractive performance/w and per-core numbers.
I think you're correct on it not being a disadvantage. Apple's competitors are the Android OEMs, Microsoft, and Dell. Apple Intelligence is a failure only in the sense that we hold Apple to a higher standard. Can anyone argue that Apple's AI implementation is more flawed than Microsoft's? I don't think so.
Being able to search photos with queries like "show me photos of me and teeray" is pretty useful.
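That kind of natural-language photo search typically works by embedding photos and the query into a shared vector space and ranking by cosine similarity. A toy sketch of the retrieval step (not Apple's actual implementation; the embedding vectors below are made-up numbers for illustration):

```python
# Sketch of embedding-based photo search: rank stored photo vectors
# against a text query vector by cosine similarity.

import math

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy 3-dim embeddings; real systems use hundreds of dimensions
# produced by a joint image-text model.
photo_embeddings = {
    "IMG_001.jpg": [0.9, 0.1, 0.0],   # two people, outdoors
    "IMG_002.jpg": [0.1, 0.9, 0.2],   # a dog on a couch
}
query_embedding = [0.8, 0.2, 0.1]     # "photos of me and teeray"

best = max(photo_embeddings,
           key=lambda k: cosine(query_embedding, photo_embeddings[k]))
print(best)  # IMG_001.jpg
```

The embedding model runs once per photo at indexing time, so the per-query cost is just the similarity scan, which is cheap enough to do on-device.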
What I really want is my phone to transcribe all of my phone calls to a Notes document. Since it isn't recording an audio conversation, I don't think the consent laws come into play.
I often use Apple Intelligence to proofread emails before I send them. It's nice that it runs on device. I don't think I ever had a use case where it would have to use their Private Cloud though.
I do love that feature. As a parent I'm part of multiple group chats for different things, and it's nice to have a single summary instead of reading 50+ unread messages.
I'm a complete Apple ecosystem user-- I have a Mac, an iPhone, an Apple Watch, Apple earbuds, and an Apple TV, and I also pay reasonably close attention to their announcements and developments-- and I couldn't tell you a single Apple Intelligence feature. Nor do I ever use Siri except for setting kitchen timers.
I was wondering the same thing. I turned notification summaries off as they were less than useful, and I don't think I've stumbled across any other Apple Intelligence features apart from the laughable Image Playground or whatever it's called.
Yeah but this is how Apple has always done infrastructure/services. Their internal software teams are a mess. They constantly reinvent the wheel poorly, and then they charge a premium for exclusive access. Is anyone surprised by this?
I mean, I think a lot of it is that they're not _really_ forcing it upon people. I think I've declined it maybe twice over the last two years. Meanwhile, Google are trying to crowbar bloody Gemini in _everywhere_, and I gather Microsoft is doing ditto.
What's insane is that the market / users don't care; they're making more money than ever... It's quite sad to see that Vision Pro, Apple Intelligence, and Liquid Glass were all failures and no one cared... I hope Android makes a comeback against Apple in the US so they're forced to innovate.
Taken another way: given Apple's enormous market reach, this could be seen as perhaps the most solid metric of actual consumer interest in AI features, ignoring the hype.
Not sure. I'm a heavy AI user at this point. Oh, also a heavy Apple user, and I've never once used an Apple AI thing since they released them. I don't even know what they released. It's a complete failure of execution on their part.
And that is exactly why it won't happen (like that).
The Siri+LLM features of Apple Intelligence aren’t launched yet, and the other features like notification summaries run on-device.
https://www.macrumors.com/2026/01/30/apple-explains-how-gemi...
Just a total failure of execution.
https://security.apple.com/blog/private-cloud-compute/