rozab | 18 days ago
I added a robots.txt with explicit UAs for known scrapers (they seem to ignore wildcards), and after a few days the traffic died down completely. I've had no problems since.
Git frontends are basically a tarpit, so they're uniquely vulnerable to this, but I wonder whether these folks actually tried a good robots.txt. I know it's wrong that they ignore wildcards, but it does seem to solve the issue.
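For reference, a minimal sketch of what that looks like (the UAs below are examples of commonly documented AI crawlers; the real list should come from whatever actually shows up in your access logs):

    # Name each crawler explicitly; some AI scrapers ignore "User-agent: *"
    User-agent: GPTBot
    User-agent: ClaudeBot
    User-agent: CCBot
    User-agent: Bytespider
    User-agent: PerplexityBot
    User-agent: Amazonbot
    Disallow: /

Per RFC 9309, consecutive User-agent lines form a single group, so the one Disallow rule applies to all of the crawlers listed above it.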
bob1029 | 18 days ago
I suspect that some of these folks are not interested in a proper solution. Being able to vaguely claim that the AI boogeyman is oppressing us has turned into quite the pastime.
embedding-shape | 18 days ago
FWIW, you're literally in a comment thread where GP (me!) says "don't understand what the big issue is"...