OsmanDKitay | 6 months ago
For example, the Aura manifest says the create_post capability needs auth. If an agent ignores that and POSTs to /api/posts without a valid cookie, our server's API rejects it with a 401. The manifest doesn't do the blocking, the backend does. It just tells cooperative agents the rules of the road ahead of time.
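A minimal sketch of that split, assuming a hypothetical create_post entry in aura.json and a toy request handler (the field names and the session check are illustrative, not part of any real Aura spec):

```python
# Hypothetical aura.json fragment: the manifest only *declares* the rule.
AURA_MANIFEST = {
    "capabilities": {
        "create_post": {
            "method": "POST",
            "path": "/api/posts",
            "auth": "required",  # advisory: tells cooperative agents up front
        }
    }
}

# The actual enforcement lives in the backend, not in the manifest.
def handle_create_post(request: dict) -> tuple[int, str]:
    """Reject the call with a 401 unless a valid session cookie is present."""
    if request.get("cookies", {}).get("session") != "valid-session-token":
        return 401, "Unauthorized"  # manifest or not, the server blocks it
    return 201, "Created"
```

A cooperative agent reads the manifest and authenticates first; an agent that ignores it still hits the same 401 from the backend.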
So the real incentive for an agent to use Aura isn't about avoiding punishment; it's about the huge upside in efficiency and reliability. Why scrape a page and guess at DOM elements when you can make a single, clean API call that you know will work? It saves the agent developer time, compute resources, and the headache of maintaining brittle scrapers.
So:
robots.txt tells good bots what they shouldn't do.
aura.json tells them what they can do, and gives them the most efficient way to do it, all backed by the server's actual security logic.
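To make the "single, clean API call" point concrete, here is a hedged agent-side sketch: the manifest shape and field names are assumptions for illustration, not a published Aura schema.

```python
# Hypothetical agent-side flow: read aura.json once, then call the
# declared endpoint directly instead of scraping and guessing at the DOM.
manifest = {
    "capabilities": {
        "create_post": {"method": "POST", "path": "/api/posts", "auth": "required"}
    }
}

def plan_request(manifest: dict, capability: str) -> dict:
    """Build a request plan from the manifest instead of a brittle scraper."""
    cap = manifest["capabilities"][capability]
    plan = {"method": cap["method"], "path": cap["path"]}
    if cap.get("auth") == "required":
        plan["needs_auth"] = True  # agent knows to authenticate before calling
    return plan
```

The agent never has to guess at selectors or form fields; if the site changes its HTML, the declared endpoint still works.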
JohnFen | 6 months ago
In that light, I guess your proposal makes a certain amount of sense. I don't think it addresses what a lot of web sites want, but that's not necessarily a bad thing. Own your niche.
OsmanDKitay | 6 months ago
paulryanrogers | 6 months ago