
alexandercheema | 1 year ago

Do you mean with Apple Intelligence? You can already query self-hosted models from Apple devices using exo, or even just run inference locally on-device.
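Querying a self-hosted model like this typically goes through an OpenAI-compatible chat-completions endpoint, which is the style of API exo serves for local models. A minimal sketch below, assuming such an endpoint; the base URL, port, and model name are illustrative assumptions, not documented defaults:

```python
# Sketch: query a locally hosted model via an OpenAI-compatible
# chat-completions API (the kind of interface exo exposes).
# The hostname, port, and model name are assumptions for illustration.
import json
import urllib.request


def build_chat_request(model, prompt):
    """Build the JSON body for a chat-completions call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def query_local_model(base_url, model, prompt):
    """POST the request to a locally running server and return its reply text."""
    body = json.dumps(build_chat_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Assumes a server is already listening locally; endpoint is hypothetical.
    print(query_local_model("http://localhost:8000", "llama-3-8b", "Hello!"))
```

Because the request shape matches the OpenAI API, the same client code works whether the model is running on-device, on a local cluster, or on a hosted service.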


gnicholas | 1 year ago

Does this work with Siri? I'm not running the beta, so I'm not familiar with the features and limitations, but I thought it was either answering based on on-device inference (using a closed model) or Apple's cloud (using a model you can't choose). My understanding is that you can ask OpenAI via an integration they've built, and that in the future you may be able to reach out to other hosted models. But I didn't see anything about being able to seamlessly reach out to your own locally-hosted models, whether as a Siri backup or for anything else. But like I said, I'm not running the beta!