[BUG] "Double" 403 errors #142

Open
s22-tech opened this issue Feb 10, 2020 · 8 comments
@s22-tech

Describe the bug
Every time the Apache 2.4 blocker is activated, two 403s are generated in the error log.

To Reproduce
I can reproduce it myself by using the test commands, e.g.:
curl --head https://www.domain.com --referer semalt.com

Expected behavior
I would expect to get only one 403 per IP, but a second one is also logged for the 403.shtml file.

Server (please complete the following information):

  • OS: CentOS 7
  • Apache/2.4.41 (cPanel) OpenSSL/1.1.1d mod_bwlimited/1.4 Phusion_Passenger/5.3.7 configured
  • Other Environments: cPanel 84.0.21
  • Any applicable error messages:
    [Mon Feb 10 2020] [authz_core:error] [client 1.2.3.4:50520] AH01630: client denied by server configuration: /home/user/public_html/, referer: semalt.com
    [Mon Feb 10 2020] [authz_core:error] [client 1.2.3.4:50520] AH01630: client denied by server configuration: /home/user/public_html/403.shtml, referer: semalt.com
    (Some info was removed/changed to shorten the entry here)

Additional information
The 403.shtml file exists and is reachable, so I don't think that error should be in the logs:
"client denied by server configuration: /home/user/public_html/403.shtml"

@s22-tech
Author

I found either a solution or a workaround; I'm not sure which. I added the following block:

<LocationMatch ^(/php-fpm)?/errors/>
    Require         all granted
    DirectoryIndex  index.php
</LocationMatch>

just after:

# ######################################
# GLOBAL! deny bad bots and IP addresses
# ######################################
#
# Should be set after <VirtualHost>s see https://httpd.apache.org/docs/2.4/sections.html#merging
<Location "/">
	# AND - combine with preceding configuration sections.
	AuthMerging And
	# Include black list.
	Include custom.d/globalblacklist.conf
</Location>

and put my error files in that directory. That stopped the double 403 entries in the log.
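
For completeness, this only helps if the ErrorDocument directives actually point into that directory; something along these lines (the file names here are just an example, not my exact setup):

# Assumes the error pages live under /errors/ so the LocationMatch above covers them
ErrorDocument 403 /errors/403.shtml
ErrorDocument 404 /errors/404.shtml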

I'm still not sure why this happened in the first place, but at least my error_log is half the size it was before.

Great script, by the way! I just hope I can solve this problem correctly.

@s22-tech
Author

Well, there's another problem: robots.txt is also blocked by this, and of course it shouldn't be. I'm beginning to think I may have installed this incorrectly.

Is there a way to open up pages like robots.txt and 403.shtml so they're allowed for everyone? That way, bots that obey the robots.txt directives wouldn't keep trying to hit the site and clog up the error_log.
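
Something like the following is what I have in mind, if it's the right approach (just a sketch; it assumes robots.txt is served from the docroot):

<Location "/robots.txt">
    # A more specific <Location> should override the global "/" deny for this
    # one URL, since its default AuthMerging is Off (i.e. replace, not combine)
    Require all granted
</Location>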

Thanks.

@mitchellkrogza
Owner

There must be something else wrong with your setup; the blocker won't block robots.txt. I run this on both Apache and Nginx servers: robots.txt handles the initial instruction to a bot, and those that disobey it thereafter get caught by the blocker. The double log entries also have nothing to do with the blocker; that would simply be an error or a duplication in your Apache config.

@s22-tech
Author

I added the following section to Apache's Post VirtualHost Include, since we were told not to mess with httpd.conf directly:

# ######################################
# GLOBAL! deny bad bots and IP addresses
# ######################################
#
# Should be set after <VirtualHost>s see https://httpd.apache.org/docs/2.4/sections.html#merging
<Location "/">
	# AND-combine with preceding configuration sections
	AuthMerging And
	# include black list
	Include custom.d/globalblacklist.conf
</Location>

Is that the correct file to add this to? Is there something else that needs to be done as well?
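
In case it helps, I've been checking that the file is actually picked up with the commands below (run as root; I believe -D DUMP_INCLUDES is available from Apache 2.4.8 onwards):

# Syntax check for the whole Apache configuration
apachectl -t
# List every config file Apache actually includes, to confirm
# post_virtualhost_global.conf is being read
httpd -t -D DUMP_INCLUDES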

Thanks.

@mitchellkrogza
Owner

I need to see the httpd.conf and the vhost configs to be able to see what's going on.

@s22-tech
Author

s22-tech commented Feb 15, 2020

Sorry, I'm not sure what you mean by "vhost config". Are those the extra files that cPanel adds to httpd.conf, like post_virtualhost_global.conf?

Also, do you want me to post them here or send them via email?

@s22-tech
Author

In the interest of saving time, I'll post what I have here. If it's not what you're asking for, let me know.

There's only one "extra" conf file that's populated - post_virtualhost_global.conf. There are no files in the listed IncludeOptional paths.

IncludeOptional /usr/local/apache/conf/sharedssl/*.conf
IncludeOptional /usr/local/apache/conf/sharedurl/*.conf

# Drop the Range header when more than 5 ranges.
# CVE-2011-3192
SetEnvIf Range (,.*?){5,} bad-range=1
RequestHeader unset Range env=bad-range

# Optional logging.
CustomLog logs/range-CVE-2011-3192.log common env=bad-range


# ######################################
# GLOBAL! deny bad bots and IP addresses
# ######################################
#
# Should be set after <VirtualHost>s see https://httpd.apache.org/docs/2.4/sections.html#merging

<Location "/">
	# AND - combine with preceding configuration sections.
	AuthMerging And
	# Include black list.
	Include custom.d/globalblacklist.conf
</Location>

<LocationMatch ^(/php-fpm)?/errors/>
    Require         all granted
    DirectoryIndex  index.php
</LocationMatch>

# Global robots.txt file for controlling crawlers.
<LocationMatch ^(/php-fpm)?/robots.txt>
    Require         all granted
    ProxyPass !
</LocationMatch>
#Alias /robots.txt /var/www/html/robots.txt
Alias /robots.txt /home/username/public_html/errors/robots_custom.txt

@s22-tech
Author

I also see that user agents aren't being blocked. Have you had a chance to see what's wrong with this cPanel setup?
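
For what it's worth, I've been testing the user-agent side the same way as the referrer test, just with a spoofed User-Agent string; "80legs" below is only my guess at an entry that should be on the blacklist:

curl --head https://www.domain.com -A "80legs"

A 403 there would suggest the user-agent rules are active, while a 200 would suggest the include isn't being applied to this vhost at all.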
