top | item 40660410


noahtallen | 1 year ago

> But it seems the goal now is just "Make it an LLM," instead of focusing on recognizing the task that the user wants to do, and connecting it to APIs that can do those tasks.

I almost completely agreed with you, but this is not true! Apple is trying to solve the task-and-API problem with App Intents, which they cover in more detail outside of the keynote: https://youtu.be/Lb89T7ybCBE

The new Siri models are trained on a large number of schemas. Apps implement those schemas to declare "I provide this action" (i.e., an action the user might intend to do). Siri can use the more advanced NLP that comes with GenAI to match what you say to a schema and route the request to the app.

These app intents are also available to Spotlight and Shortcuts, making them more powerful than just being Siri actions.
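For a rough idea of what implementing one of these looks like, here is a minimal sketch of an App Intent in Swift. The `OrderCoffeeIntent` name, the `drink` parameter, and the ordering logic are all hypothetical; the `AppIntent` protocol, `@Parameter`, and `perform()` are the real AppIntents framework surface, and conforming types are what Siri, Spotlight, and Shortcuts can discover and invoke.

```swift
import AppIntents

// Hypothetical intent: declares "this app can order a drink."
// Conforming to AppIntent exposes it to Siri, Spotlight, and Shortcuts.
struct OrderCoffeeIntent: AppIntent {
    static var title: LocalizedStringResource = "Order Coffee"

    // A typed parameter Siri can fill in from natural language.
    @Parameter(title: "Drink")
    var drink: String

    // Called when the system matches the user's request to this intent.
    func perform() async throws -> some IntentResult {
        // App-specific ordering logic would go here.
        return .result()
    }
}
```

The schema-based matching described above goes a step further: in newer SDKs an intent can be tagged as conforming to one of Apple's predefined assistant schemas, so the model only has to map an utterance to a known schema rather than to each app's bespoke intent.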


apeace | 1 year ago

Wow, that's great to hear! Excited to see what comes of it.