top | item 40759793

astine | 1 year ago

I agree. Robots.txt is a suitable means of preventing crawlers from accidentally DOSing your site, but it doesn't really give you any protections as to how your content is used by automated services. The current anything-goes approach is just too exploitable.
