delecti | 6 days ago
I'm sure their crawler can handle a zip bomb. Plus it might interpret that as "this site doesn't have a robots.txt" and start the scraping that OP is trying to prevent with their current robots.txt.

    marginalia_nu | 6 days ago
    Pretty sure every crawler can. You kinda have to go out of your way not to, given how the gzread API looks. https://refspecs.linuxbase.org/LSB_3.0.0/LSB-Core-generic/LS...

    1e1a | 6 days ago
    Could allow only the path to the zip bomb for this user agent.

        FartyMcFarter | 6 days ago
        That will work once at most and then quickly get fixed.
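For reference, 1e1a's suggestion would look something like this in robots.txt (the user-agent string and path are placeholders, not from the thread): permit only the zip-bomb path for the offending bot and disallow everything else.

```
# Hypothetical sketch: steer one bot toward the bomb, block the rest.
User-agent: BadBot
Allow: /zipbomb.gz
Disallow: /

User-agent: *
Disallow:
```

As FartyMcFarter notes, this only works until the crawler's operators notice and special-case the site.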