Christian Folini
Agreed. Let's close this.
OK. We're catching this at PL2 as is. Catching it at PL1 would have been cool, but there is no pressing need (as we catch it at PL2). We're unsure...
I'm pondering overhauling the UA lists entirely. This would be covered too. Let's keep it open for a moment.
OK, here we go.

### Status Quo / Existing rules

```
913110 PL1 critical scanners-headers.data      8 entries  issue:#2647
913120 PL1 critical scanners-urls.data        17 entries  issue:#2648
913100 PL1 critical scanners-user-agents.data 93...
```
Thank you for the confirmation. I have meanwhile dropped the "There is a certain risk ..." sentence. It was an editing error; there was no additional thought behind that broken sentence. I am...
Simon Bennetts responded, and his answer is not very comforting:

```
You would probably have to run them and check what they send by default :/
And you should probably...
```
Update: I have a minimal version covering the "Apache ultimate bad bot list" and "Crawler Detect" - 1731 entries - and have deployed it on netnea.com for testing purposes. No exclusions yet,...
Still no exclusions, but this monster comes up with a list of >2K user agents:

```
CURL_OPTIONS="--silent"
( curl $CURL_OPTIONS https://raw.githubusercontent.com/mitchellkrogza/apache-ultimate-bad-bot-blocker/master/Apache_2.4/custom.d/globalblacklist.conf -s | \
  grep BrowserMatchNoCase | \
  grep bad_bot ...
```
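For anyone who wants to reproduce the extraction step offline, here is a minimal sketch of the same idea run against a tiny inline sample instead of the live globalblacklist.conf. The sample lines and the `sed` extraction of the quoted pattern are assumptions based on the Apache `BrowserMatchNoCase "<regex>" <env-var>` format, not the actual CRS tooling:

```shell
# Hypothetical stand-in for a few lines of globalblacklist.conf.
sample_conf='BrowserMatchNoCase "^EvilScraper" bad_bot
BrowserMatchNoCase "^NiceBot" good_bot
BrowserMatchNoCase "^SpamCrawler" bad_bot'

# Keep only the bad_bot entries, then strip everything
# except the quoted user-agent pattern, and de-duplicate.
bad_agents="$(printf '%s\n' "$sample_conf" \
  | grep BrowserMatchNoCase \
  | grep bad_bot \
  | sed -E 's/.*"([^"]*)".*/\1/' \
  | sort -u)"

printf '%s\n' "$bad_agents"
```

Swapping the here-string for the `curl` fetch above gives the full >2K list; the `sort -u` is what keeps duplicates between the two upstream sources from inflating the count.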
For the record: I picked up work on this again. It's just a lot of work, so it takes time.
Interesting list of "verified" bots that could be used to govern the exceptions to the lists posted above: https://radar.cloudflare.com/traffic/verified-bots
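One way such a verified-bots list could govern the exceptions is a plain case-insensitive set difference: drop every candidate entry that also appears on the allowlist. A minimal sketch with hypothetical file names and entries (none of this is CRS data):

```shell
# Hypothetical candidate bad-bot entries and a verified-bots allowlist.
printf '%s\n' 'EvilScraper' 'Googlebot' 'SpamCrawler' > candidates.txt
printf '%s\n' 'googlebot' 'bingbot' > verified.txt

# Keep only candidates that are NOT on the verified list
# (-v: invert, -i: case-insensitive, -x: whole line,
#  -F: fixed strings, -f: read patterns from file).
filtered="$(grep -vixFf verified.txt candidates.txt)"

printf '%s\n' "$filtered"

rm -f candidates.txt verified.txt
```

The `-x`/`-F` combination matters here: without it, a short allowlist entry like `bot` would accidentally suppress every candidate containing that substring.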