Nab443|7 months ago
I tend to think that they should mostly be using their own user agent, and if not, be disguised as the most common ones to avoid being detected too easily. Web scraping has probably mostly been running under Linux since before the age of AI anyway. I'm not in the field, though, so if anyone has more trustworthy info on that...
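For context, the "own user agent" approach the comment describes just means a scraper setting an honest, identifiable User-Agent header instead of impersonating a browser. A minimal Python sketch (the bot name and info URL here are hypothetical):

```python
import urllib.request

# Hypothetical: a scraper identifying itself with its own User-Agent
# string, including a URL where site operators can learn about the bot,
# rather than impersonating a common browser.
req = urllib.request.Request(
    "https://example.com/",
    headers={"User-Agent": "MyScraperBot/1.0 (+https://example.com/bot-info)"},
)

# The header is attached before any request is sent; urllib normalizes
# the header key to "User-agent" internally.
print(req.get_header("User-agent"))
```

Sites can then recognize (and rate-limit or block) the bot by name, which is exactly why less polite scrapers spoof common browser user agents instead.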
eloisant|7 months ago
Either way, I don't think the 5% figure is affected by scraping bots.