boa
Make the crawler protection optional / enabled by default
One of my clients' SEO firms is complaining that their crawler has been receiving 403 errors for the past few weeks. When I checked the Provision repository, it seems SEMrush was blocked in a commit on 18 February.
Is there any information on why it was blocked that I can take back to my client or SEMrush?
If any bot is blacklisted, it means we have received enough reports about system instability caused by an overly aggressive bot/crawler. You can remove the block from your system if you wish, but it will be overwritten the next time you run Barracuda with the Aegir master instance upgrade.
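For context, blacklists of this kind are typically implemented in nginx as a user-agent match that returns 403. A minimal sketch of the pattern (the actual directives, file locations, and bot list shipped by Barracuda may differ):

```nginx
## Hypothetical example of an nginx user-agent blacklist of the sort
## a Barracuda upgrade could regenerate; the real rule and the exact
## bots blocked in BOA may differ.
if ($http_user_agent ~* "SemrushBot") {
  return 403;
}
```

Because such rules live in nginx include files that the upgrade process rewrites, a manual edit removing the match survives only until the next upgrade run.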
We should probably make this protection optional and enabled by default.
Thanks for the response. That's basically what I was expecting based on #784. I'm going to give SEMrush a chance to fix their issues before opening the flood gates when this feature is implemented.