item 32891575

traek | 3 years ago

> It's somewhat surprising to hear that requests would be rejected if the user agent doesn't match a set of hard coded IP addresses.

It’s fairly common for DDoS/scraping prevention; Googlebot (and most other crawlers) publish their IP ranges for exactly that reason[0][1][2]. I don’t work at Cloudflare, though, so I have no insider knowledge of what you folks are doing.

[0] https://developers.google.com/search/docs/crawling-indexing/...

[1] https://developers.facebook.com/docs/sharing/webmasters/craw...

[2] https://developer.twitter.com/en/docs/twitter-for-websites/c...
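The check described above can be sketched roughly like this: a request whose user agent claims to be Googlebot is rejected unless its source IP falls inside a published crawler range. The CIDR blocks and function names below are illustrative samples I chose for the sketch, not an authoritative list; a real deployment would load the current ranges from the files the crawlers publish ([0]–[2]).

```python
# Sketch of UA-vs-IP crawler verification, under the assumption that the
# site keeps a snapshot of the crawler's published IP ranges.
import ipaddress

# Hypothetical snapshot of published Googlebot ranges (example values only;
# fetch the live list from the crawler's published JSON in practice).
GOOGLEBOT_RANGES = [
    ipaddress.ip_network("66.249.64.0/19"),
    ipaddress.ip_network("2001:4860:4801::/48"),
]

def is_published_crawler_ip(addr: str) -> bool:
    """Return True if addr falls inside any published crawler range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in GOOGLEBOT_RANGES)

def should_reject(user_agent: str, addr: str) -> bool:
    """Reject requests whose UA claims Googlebot but whose IP doesn't match."""
    claims_googlebot = "Googlebot" in user_agent
    return claims_googlebot and not is_published_crawler_ip(addr)

# A UA claiming Googlebot from outside the ranges is treated as a spoof;
# the same UA from inside the ranges passes.
print(should_reject("Mozilla/5.0 (compatible; Googlebot/2.1)", "203.0.113.7"))
print(should_reject("Mozilla/5.0 (compatible; Googlebot/2.1)", "66.249.66.1"))
```

Note the design choice: a non-crawler UA from an unlisted IP is *not* rejected by this rule; the range check only gates requests that claim to be a crawler.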
