cairnechou | 3 months ago
A common protocol—like a robots.txt but for AI agents—feels like the inevitable and necessary next step. We need a way for sites to "semantically" declare their functions and content to a machine.
This raises a huge game-theory question, though: What is the website's incentive to adopt this?
It's a "cooperate or defect" dilemma. If a site doesn't cooperate, the AI agent will just scrape it (badly). If it does cooperate, it makes it easier for the AI agent to summarize its content and potentially bypass its ad/conversion funnels.
I'm curious what the authors think the "win-win" is here for the websites themselves.
vinibrito | 3 months ago
No need for a new protocol or anything like it, just a convention, as you mentioned, similar to robots.txt.
Perhaps aidata.json.
cairnechou | 3 months ago
A simple convention like aidata.json is perfect. That's the "win-win" I was looking for: the site gets to clearly offer what it wants the AI to see, and the AI gets clean data instead of having to guess at brittle HTML.
aidata.json is a great name for it.
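To make the idea concrete, here's a minimal sketch of what an aidata.json at a site's root might contain. This is purely illustrative, since no such spec exists; every field name here (summary, content, actions, monetization, etc.) is invented for the example:

```json
{
  "version": "0.1",
  "summary": "Independent hardware review site",
  "content": [
    {
      "path": "/reviews/",
      "description": "Product reviews, updated weekly",
      "ai_access": "summary-only"
    }
  ],
  "actions": [
    {
      "name": "search_reviews",
      "endpoint": "/api/search?q={query}",
      "description": "Full-text search over published reviews"
    }
  ],
  "monetization": {
    "attribution_required": true,
    "link_back_to_source": true
  }
}
```

The interesting design question is the monetization block: that's where the "win-win" would have to live, e.g. the site trades clean structured data for guaranteed attribution and link-backs instead of silent summarization.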