
[BUG] "Double" 403 errors

Open · s22-tech opened this issue Feb 10 '20 · 8 comments

Describe the bug
Every time the Apache 2.4 blocker blocks a request, two 403s are generated in the error log.

To Reproduce
I can reproduce it myself using the test commands, e.g.:

curl --head https://www.domain.com --referer semalt.com

Expected behavior
I would expect to get only one 403 per blocked request, but it also generates one for the 403.shtml file.

Server (please complete the following information):

  • OS: CentOS 7
  • Apache/2.4.41 (cPanel) OpenSSL/1.1.1d mod_bwlimited/1.4 Phusion_Passenger/5.3.7 configured
  • Other Environments: cPanel 84.0.21
  • Any applicable error messages (some info was removed/changed to shorten the entries):

    [Mon Feb 10 2020] [authz_core:error] [client 1.2.3.4:50520] AH01630: client denied by server configuration: /home/user/public_html/, referer: semalt.com
    [Mon Feb 10 2020] [authz_core:error] [client 1.2.3.4:50520] AH01630: client denied by server configuration: /home/user/public_html/403.shtml, referer: semalt.com

Additional information
The 403.shtml file exists and is reachable, so I don't think that error should be in the logs: "client denied by server configuration: /home/user/public_html/403.shtml"
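My guess is that the second entry comes from Apache's internal redirect to the ErrorDocument, which is then caught by the same deny rule. If so, something like this might avoid it, though I haven't confirmed it's the right fix (assumes 403.shtml sits in the docroot):

# Untested sketch: grant access to the error page itself so the
# internal redirect isn't re-denied by the global <Location "/"> block.
ErrorDocument 403 /403.shtml
<Files "403.shtml">
    Require all granted
</Files>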

s22-tech avatar Feb 10 '20 19:02 s22-tech

I found either a solution or a workaround; I'm not sure which. I added the following block:

<LocationMatch ^(/php-fpm)?/errors/>
    Require         all granted
    DirectoryIndex  index.php
</LocationMatch>

just after:

# ######################################
# GLOBAL! deny bad bots and IP addresses
# ######################################
#
# Should be set after <VirtualHost>s see https://httpd.apache.org/docs/2.4/sections.html#merging
<Location "/">
	# AND - combine with preceding configuration sections.
	AuthMerging And
	# Include black list.
	Include custom.d/globalblacklist.conf
</Location>

and put my error files in that directory. That stopped the double 403 entries in the log.
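For this to take effect, the ErrorDocument directives presumably need to point into that directory as well, along these lines (paths illustrative):

ErrorDocument 403 /errors/403.shtml
ErrorDocument 404 /errors/404.shtml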

I'm still not sure why this happened in the first place, but at least my error_log is half the size it was before.

Great script, by the way! I just hope I can solve this problem correctly.

s22-tech avatar Feb 12 '20 21:02 s22-tech

Well, there's another problem. robots.txt is also blocked by this and, of course, it shouldn't be. I'm beginning to think I may have installed this incorrectly.

Is there a way to open up pages like robots.txt and 403.shtml so they're accessible to everyone? That way, bots that would obey the robots.txt directives wouldn't keep trying to hit the site and clog up the error_log.
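I was thinking of something along these lines, but I'm not sure it's the right approach (untested sketch):

# Untested: explicitly allow the two public paths ahead of the global deny.
<LocationMatch "^/(robots\.txt|403\.shtml)$">
    Require all granted
</LocationMatch>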

Thanks.

s22-tech avatar Feb 13 '20 17:02 s22-tech

Must be something else wrong with your setup. The blocker won't block robots.txt. I run this on both Apache and Nginx servers; the robots.txt handles the initial instruction to a bot, and thereafter those that disobey robots.txt get caught by the blocker. The double log entries also have nothing to do with the blocker; that would simply be an error or duplication in your Apache config.
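For context, the initial instruction is just the standard robots.txt protocol; a minimal illustrative example, nothing specific to the blocker:

# Illustrative robots.txt: compliant bots read this and back off;
# those that ignore it are the ones the blocker catches.
User-agent: *
Disallow: /private/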

mitchellkrogza avatar Feb 14 '20 08:02 mitchellkrogza

I added the following section to Apache's Post VirtualHost Include, since we were told not to mess with httpd.conf directly:

# ######################################
# GLOBAL! deny bad bots and IP addresses
# ######################################
#
# Should be set after <VirtualHost>s see https://httpd.apache.org/docs/2.4/sections.html#merging
<Location "/">
	# AND-combine with preceding configuration sections
	AuthMerging And
	# include black list
	Include custom.d/globalblacklist.conf
</Location>

Is that the correct file to add this to? Is there something else that needs to be done as well?
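One sanity check that might be worth running, in case the Include path itself is wrong (command assumed available on this setup):

# Reports "Syntax OK" or the first configuration error,
# including an unreadable Include file.
httpd -t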

Thanks.

s22-tech avatar Feb 14 '20 16:02 s22-tech

Need to see the httpd.conf and vhost configs to be able to see what's going on.

mitchellkrogza avatar Feb 15 '20 07:02 mitchellkrogza

Sorry - not sure what you mean by "vhost config". Are those the extra files that cPanel adds to httpd.conf - like post_virtualhost_global.conf?

Also, do you want me to post them here or send them via email?

s22-tech avatar Feb 15 '20 13:02 s22-tech

In the interest of saving time, I'll post what I have here. If it's not what you're asking for, let me know.

There's only one "extra" conf file that's populated: post_virtualhost_global.conf. There are no files in the IncludeOptional paths listed below.

IncludeOptional /usr/local/apache/conf/sharedssl/*.conf
IncludeOptional /usr/local/apache/conf/sharedurl/*.conf

# Drop the Range header when more than 5 ranges.
# CVE-2011-3192
SetEnvIf Range (,.*?){5,} bad-range=1
RequestHeader unset Range env=bad-range

# Optional logging.
CustomLog logs/range-CVE-2011-3192.log common env=bad-range


# ######################################
# GLOBAL! deny bad bots and IP addresses
# ######################################
#
# Should be set after <VirtualHost>s see https://httpd.apache.org/docs/2.4/sections.html#merging

<Location "/">
	# AND - combine with preceding configuration sections.
	AuthMerging And
	# Include black list.
	Include custom.d/globalblacklist.conf
</Location>

<LocationMatch ^(/php-fpm)?/errors/>
    Require         all granted
    DirectoryIndex  index.php
</LocationMatch>

# Global robots.txt file for controlling crawlers.
<LocationMatch ^(/php-fpm)?/robots.txt>
    Require         all granted
    ProxyPass !
</LocationMatch>
#Alias /robots.txt /var/www/html/robots.txt
Alias /robots.txt /home/username/public_html/errors/robots_custom.txt
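A quick way to confirm the aliased robots.txt is actually reachable (domain illustrative):

# Should return 200 if the Alias and the LocationMatch grant are working.
curl --head https://www.domain.com/robots.txt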

s22-tech avatar Feb 15 '20 21:02 s22-tech

I also see that user agents aren't being blocked. Have you had a chance to see what's wrong with this cPanel setup?
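In case it helps: user-agent blocking can be tested the same way as the referrer test, with an agent string from globalblacklist.conf. The string below is just an example I'd expect to be on the list:

# Should return 403 if user-agent blocking is active ("80legs" is assumed
# to appear in the blacklist; substitute any agent from globalblacklist.conf).
curl --head https://www.domain.com -A "80legs"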

s22-tech avatar Feb 21 '20 15:02 s22-tech