Apple has been shipping various pieces of advanced AI in their consumer products for many years already. As the technology has allowed it, they have started moving even larger models to the edge - Siri is now available on-device, for example.
It will be interesting to see how much Siri improves this year with iOS 17. My guess is they'll give it a modest update that isn't as mind-blowing as ChatGPT but is impressive for its realistic-sounding voice, thanks to all of their text-to-speech improvements, plus some incremental gains on trivia and knowledge-base questions. I bet the big leap forward in Siri handling more complex tasks is still at least a version away.
The $20/mo that OpenAI charges can't remotely be covering server costs. If Apple licenses GPT-4 for a lot of money, Siri could be much improved without having to wait several generations.
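A rough back-of-envelope check on that claim. Every number below is an illustrative assumption (cloud GPU pricing, throughput, and per-user usage are not published OpenAI figures), but it sketches why a heavy user could plausibly cost more than the subscription:

```python
# Back-of-envelope: can $20/mo cover a heavy ChatGPT user's compute?
# All inputs are illustrative assumptions, NOT real OpenAI numbers.

GPU_HOUR_COST = 2.00                # assumed $/hour for an A100-class cloud GPU
TOKENS_PER_GPU_SECOND = 30          # assumed sustained generation throughput
TOKENS_PER_USER_MONTH = 2_000_000   # assumed heavy user (~65k tokens/day)

gpu_seconds = TOKENS_PER_USER_MONTH / TOKENS_PER_GPU_SECOND
compute_cost = gpu_seconds / 3600 * GPU_HOUR_COST

# → roughly $37/month under these assumptions, i.e. above the $20 subscription
print(f"Assumed monthly compute cost per heavy user: ${compute_cost:.2f}")
```

Light users would cost far less, so whether the subscription is profitable in aggregate depends entirely on the usage distribution - which is exactly why the numbers above are only a sketch.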
supermatt|2 years ago|reply
They have been building dedicated hardware into their devices specifically for this - for years! State-of-the-art LLMs have already been shown to run on iOS devices.
I very much doubt that Apple is behind - more likely they're just polishing off their next use case.
neilalexander|2 years ago|reply
A few fun everyday examples:
- Facial recognition and the image classification/search in the Photos app
- Live Text recognition in images and videos
- Voice recognition by Siri (i.e. identifying different HomePod and Apple TV users)
- Realtime colour adjustments and other camera image processing
- Noise reduction and voice isolation microphone modes
deepandmeaning|2 years ago|reply
I watched a video of Emad Mostaque (Stability AI CEO) recently; he alluded to discussions with Apple, but who knows to what degree. The general thrust was that we need to have greater control over our data, and that's one of the things Stability is working towards, which chimes with Apple's overall approach.
Historically speaking, Apple is known for adopting new things after others have iterated on them a bit and then doing them somewhat better.
Edit: hey, is there any progress on speech recognition? I haven't heard any hype about it. Apple could improve how Siri deals with non-native accents in English.