Your usage of Siri today (probably on an old version of iOS) frankly has nothing to do with the article we're discussing. Sorry to say, but it's going to take time: you're comparing the performance of ChatGPT running in a big data center with a model running locally on a phone. Give it a few years.
ninkendo|7 months ago
Siri needs to be taken out back and shot. The problem with “upgrading” it is the pull to maintain backwards compatibility for every little thing Siri did, which leads them to try to incorporate existing Siri functionality (and existing Siri engineers) alongside any LLM. That leads to disaster: none of it works, and it just made everything slower. They've been trying to do an LLM-assisted Siri for years now, and it's the most public-facing disaster the company has had in a while. Time to start over.
lxgr|7 months ago
Build a crude router in front of it, if you must, or give it access to "the old Siri" as a tool it can call, and let the LLM decide whether to return its own or a Siri-generated response!
I bet even smaller LLMs would be able to figure out, given a user input and Siri response pair, whether the request was reasonably answered or whether the model itself could do better, or at least explain that the request is beyond its capabilities for now.
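The router idea above could look something like this minimal sketch. Everything here is hypothetical: `legacy_siri`, `llm_judge`, and `llm_answer` are stand-ins for the real components (the existing Siri pipeline, a small judge model prompted with the request/response pair, and a full LLM fallback), not any actual API.

```python
# Hypothetical sketch of an LLM router in front of the legacy assistant.
# All function names and behaviors here are made up for illustration.

def legacy_siri(request: str) -> str:
    """Stand-in for the existing Siri pipeline (canned responses)."""
    canned = {
        "set a timer for 5 minutes": "OK, 5-minute timer started.",
    }
    return canned.get(request.lower(), "Sorry, I didn't get that.")

def llm_judge(request: str, siri_response: str) -> bool:
    """Stand-in for a small LLM asked: did this response reasonably
    answer the request? Here we just flag the canned failure message."""
    return "sorry" not in siri_response.lower()

def llm_answer(request: str) -> str:
    """Stand-in for the full LLM response (or an honest
    'that's beyond my capabilities for now' message)."""
    return f"(LLM) Here's my best attempt at: {request!r}"

def route(request: str) -> str:
    """Try the legacy path first; fall back to the LLM if the judge
    decides the legacy response didn't actually answer the request."""
    siri_response = legacy_siri(request)
    if llm_judge(request, siri_response):
        return siri_response
    return llm_answer(request)

print(route("Set a timer for 5 minutes"))   # legacy path wins
print(route("Summarize my unread emails"))  # falls back to the LLM
```

The appeal of this shape is that the legacy system stays untouched: the judge only sees input/output pairs, so it can be swapped or tuned without touching the old Siri code at all.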
mrheosuper|7 months ago
lxgr|7 months ago
Yes, but isn't that infuriating? The technology exists! It even exists, as evidenced by this article, in the same company that provides Siri!
At least I feel that way every time I interact with it – or for that matter my Google Home speaker, ironically made and operated by the company that invented transformer networks.