
rubiquity | 19 days ago

The scrapers should use some discretion. There are some rather obvious optimizations: content that has not changed recently is less likely to change in the future, so it can be re-fetched less often.
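
A minimal sketch of what that kind of discretion could look like, assuming a Python scraper using the requests library; the function names and the double-the-interval-when-unchanged policy are illustrative, not what any particular crawler actually does:

    import requests  # assumed HTTP client; any would do

    def next_interval(prev_interval, changed, base=3600, cap=30 * 24 * 3600):
        # Re-crawl a page sooner when it changed, and back off (up to a cap)
        # when repeated fetches keep finding it unchanged.
        if changed:
            return base
        return min(prev_interval * 2, cap)

    def fetch_if_changed(url, etag=None, last_modified=None):
        # Conditional GET: the server can answer 304 Not Modified instead of
        # resending the whole body, which also tells the crawler nothing changed.
        headers = {}
        if etag:
            headers["If-None-Match"] = etag
        if last_modified:
            headers["If-Modified-Since"] = last_modified
        resp = requests.get(url, headers=headers, timeout=30)
        if resp.status_code == 304:
            return None, etag, last_modified  # unchanged since last fetch
        return resp.text, resp.headers.get("ETag"), resp.headers.get("Last-Modified")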


JohnTHaller | 19 days ago

They don't care. That's why they ignore robots.txt and change their user agents when you specifically block them.