feifan|2 years ago
The part that Siri is bad at will be commoditized — someone will open-source a GPT-4-level language model. And Apple's moat will be being able to run that on-device with all the attendant benefits (privacy, zero marginal cost to the company, availability in more scenarios, etc)
l33tman|2 years ago
Note I'm not suggesting you can pack the full knowledge base of humanity into those 2 GB of RAM, but the key feature of an edge AI is simply understanding instructions, something Siri and OK Google struggle with at best.
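As a back-of-the-envelope check on that 2 GB figure, here is a sketch of how many parameters fit in a given amount of RAM at different weight precisions. The numbers are illustrative assumptions (weights only, ignoring activations, KV cache, and runtime overhead), not measurements of any real model.

```python
# Rough capacity math: how many model parameters fit in a RAM budget,
# counting weight storage only. All figures are illustrative assumptions.

def max_params(ram_bytes: int, bits_per_weight: int) -> float:
    """Largest parameter count whose weights alone fit in ram_bytes."""
    return ram_bytes / (bits_per_weight / 8)

RAM = 2 * 1024**3  # the 2 GB mentioned above

for bits in (16, 8, 4):
    print(f"{bits}-bit weights: ~{max_params(RAM, bits) / 1e9:.1f}B parameters")
```

So at 4-bit precision, weights for a model in the low billions of parameters could in principle fit, which is why aggressive quantization matters so much for edge inference.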
ithkuil|2 years ago
I think this could be a scenario of converging incentives: on one side, large models will incentivize hardware manufacturers to increase the memory available on devices, while on the other, model developers will be incentivized to trim the fat from their models and devise compression mechanisms that don't compromise quality too much.
It's not unthinkable that a handheld device could run full inference locally a few device generations from now.
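One common form of the "compression without compromising quality too much" idea is weight quantization. Below is a minimal pure-Python sketch of symmetric 4-bit quantization and reconstruction; real runtimes do this per weight group with far more care, and all the numbers here are illustrative, not taken from any actual model.

```python
# Minimal sketch of symmetric weight quantization: floats are mapped to
# small signed integers plus one shared scale, then reconstructed.

def quantize(weights, bits=4):
    """Map floats to signed ints in [-(2**(bits-1)-1), 2**(bits-1)-1]."""
    qmax = 2 ** (bits - 1) - 1
    scale = max(abs(w) for w in weights) / qmax
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Reconstruct approximate float weights from ints and the scale."""
    return [qi * scale for qi in q]

w = [0.12, -0.7, 0.33, 0.05, -0.21]
q, s = quantize(w)
w_hat = dequantize(q, s)
# Worst-case reconstruction error is bounded by half the scale step.
err = max(abs(a - b) for a, b in zip(w, w_hat))
```

The trade-off is visible directly: storage drops from 32 or 16 bits per weight to 4, while the reconstruction error stays bounded by half the quantization step.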