top | item 45053427


r1ch | 6 months ago

At first they were easily detectable using HTTP header analysis - e.g. pretending to be Chrome but not sending the headers that Chrome always sends. Now it's a combination of TLS / HTTP protocol-level analysis and application-layer checks - e.g. we set a cookie on the user's "normal" page views and check that it exists on the higher-resource pages they might later visit. The bots don't care about normal viewing patterns and try to hit the higher-resource pages on their first visit, so they get blocked.
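The cookie-gating idea described above can be sketched roughly like this. This is a hypothetical illustration, not r1ch's actual implementation: the path names, cookie name, and `handle` function are all made up for the example.

```python
# Illustrative sketch: gate expensive endpoints on a cookie that is
# only set during normal browsing. All names here are assumptions.

EXPENSIVE_PATHS = {"/search", "/export"}  # hypothetical high-resource endpoints
SEEN_COOKIE = "seen"                      # set on any "normal" page view

def handle(path: str, cookies: dict) -> tuple:
    """Return (status, cookies_to_set) for a request."""
    if path in EXPENSIVE_PATHS and SEEN_COOKIE not in cookies:
        # First request went straight to an expensive page: typical bot
        # behavior, since real users browse normal pages first.
        return 403, {}
    # Normal page view: mark the client so later expensive requests pass.
    return 200, {SEEN_COOKIE: "1"}
```

A real deployment would use a signed or server-tracked token rather than a bare cookie, since a scraper that notices the pattern can simply replay it; the point is that naive bots never make the "normal" request at all.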
