
Bot blocker not working for Apache/2.4.29 (Ubuntu) #139

Open
contactr2m opened this issue Dec 19, 2019 · 12 comments
contactr2m commented Dec 19, 2019

Server version: Apache/2.4.29 (Ubuntu)
I tried following your steps for Apache 2.4, but when I test my site using curl, the bot blocker does not seem to be working.

curl shows a 301, maybe because I have an HTTP-to-HTTPS redirect?

curl -A "80legs" http://example.com
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>301 Moved Permanently</title>
</head><body>
<h1>Moved Permanently</h1>
<p>The document has moved <a href="https://example.com/">here</a>.</p>
<hr>
<address>Apache/2.4.29 (Ubuntu) Server at example.com Port 80</address>

But if I use curl -A "80legs" https://example.com,
the entire page is loaded instead of a 403.
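
A note on the 301: curl does not follow redirects by default, so a request to the http:// URL only ever shows the redirect response itself. To test through an HTTP-to-HTTPS redirect and see the final status, the standard -L (follow redirects) and -I (headers only) flags can be combined:

# follow the redirect chain and print only the response headers,
# so the final status code after the redirect is visible
curl -IL -A "80legs" http://example.com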

/etc/apache2# cat sites-enabled/000-default.conf

UseCanonicalName On

<VirtualHost *:80>
        ServerAdmin webmaster@localhost
        
        ServerName example.com
        ServerAlias www.example.com
        
        DocumentRoot /var/www/html

        <Directory /var/www/html/>
            Options FollowSymLinks
            AllowOverride All
            # Include /etc/apache2/custom.d/globalblacklist.conf
            Include custom.d/globalblacklist.conf
            Require all denied
        </Directory>

        ErrorLog ${APACHE_LOG_DIR}/error.log
        CustomLog ${APACHE_LOG_DIR}/access.log combined
        RewriteEngine on
        RewriteCond %{SERVER_NAME} =example.com [OR]
        RewriteCond %{SERVER_NAME} =www.example.com
        RewriteRule ^ https://%{SERVER_NAME}%{REQUEST_URI} [END,NE,R=permanent]
</VirtualHost>

sites-enabled/000-default-le-ssl.conf

<IfModule mod_ssl.c>
<VirtualHost *:443>
        ServerAdmin webmaster@localhost
        
        ServerName example.com
        ServerAlias www.example.com
        
        DocumentRoot /var/www/html

        <Directory /var/www/html/>
            Options FollowSymLinks
            AllowOverride All
            # Include /etc/apache2/custom.d/globalblacklist.conf
            Include custom.d/globalblacklist.conf
            Require all denied
        </Directory>

        ErrorLog ${APACHE_LOG_DIR}/error.log
        CustomLog ${APACHE_LOG_DIR}/access.log combined

Include /etc/letsencrypt/options-ssl-apache.conf
SSLCertificateFile /etc/letsencrypt/live/example.com/fullchain.pem
SSLCertificateKeyFile /etc/letsencrypt/live/example.com/privkey.pem
</VirtualHost>
</IfModule>
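
One thing stands out in both vhosts: the Directory block ends with a blanket Require all denied, which in Apache 2.4 should make every request return 403. Since pages still load, something else (for example an .htaccess file permitted by AllowOverride All) is probably replacing the directory's access rules, and that would override the blocker's rules too. A minimal sketch of the shape the setup instructions appear to expect; the Require all granted and AllowOverride None lines are assumptions here, not copied from the project:

<Directory /var/www/html/>
    Options FollowSymLinks
    # AllowOverride None stops .htaccess files from replacing these rules (assumption)
    AllowOverride None
    # grant normal access, then let the blocker's include deny the bad bots
    Require all granted
    Include custom.d/globalblacklist.conf
</Directory>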

ZerooCool commented Feb 25, 2020

I have the same problem, with:
Require all denied

If I add
Include custom.d/globalblacklist.conf

at the beginning or at the end of
<Directory /var/www/server-ip>
Require all denied

then my 403 no longer works and I can see index.html.

My Vhost : https://wiki.visionduweb.fr/index.php?title=VirtualHosts_des_domaines_enregistr%C3%A9s#139.99.173.195_.C3.A9coute_du_port_HTTP_80


cafn commented May 3, 2020

No response so far; I am having the same problems. I just installed (actually updated to the newest version as of this date) and I cannot get the test responses to reply as they are supposed to. My responses:

curl -A "googlebot" https://mydomain.com
Should respond with 200 OK, but instead I get no response on http://, and https:// loads the page.

curl -A "80legs" https://mydomain.com
curl -A "masscan" https://mydomain.com
Should respond with 403 Forbidden, but instead I get no response on http://, and https:// loads the page.

curl -I https://mydomain.com -e http://100dollars-seo.com
curl -I https://mydomain.com -e http://zx6.ru
Should respond with 403 Forbidden, but I get "HTTP/2 200" back on both http:// and https://. The connection is made.

Granted, the URL is forwarded to https://, so I did not expect a reply on plain http://, but I did expect the documented results when using https://. That did not happen.
I installed the previous version about 18 months ago and it worked fine when installed, but I had not checked it since. I just re-downloaded the new files as per the instructions and left the vhost include statement as it was. It does not appear to be working for me, and I would really like it to work. I have read the instructions a couple of additional times and made sure all files were in place.

Anybody got any ideas?


ZerooCool commented May 5, 2020

In my logs I see the same for AspiegelBot.
It should respond with 403 Forbidden (or a 302 redirect?), but I get "200".

curl -A "80legs" https://www.visionduweb.fr
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>302 Found</title>
</head><body>
<h1>Found</h1>
<p>The document has moved <a href="https://www.visionduweb.fr/403-forbidden.php">here</a>.</p>
</body></html>


curl -I https://www.visionduweb.fr -e http://100dollars-seo.com
HTTP/1.1 302 Found
Date: Tue, 05 May 2020 17:04:17 GMT
Server: Apache
Strict-Transport-Security: max-age=63072000; includeSubDomains; preload
X-Frame-Options: SAMEORIGIN
Referrer-Policy: no-referrer-when-downgrade
Feature-Policy: geolocation none;midi none;notifications none;push none;sync-xhr self;microphone none;camera none;magnetometer none;gyroscope none;speaker self;vibrate none;fullscreen self;payment none;
Location: https://www.visionduweb.fr/403-forbidden.php
Cache-Control: max-age=604800
Expires: Tue, 12 May 2020 17:04:17 GMT
Content-Type: text/html; charset=iso-8859-1


curl -A "AspiegelBot" https://www.visionduweb.fr
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>302 Found</title>
</head><body>
<h1>Found</h1>
<p>The document has moved <a href="https://www.visionduweb.fr/403-forbidden.php">here</a>.</p>
</body></html>


curl -A "AspiegelBot" -L https://www.visionduweb.fr
<!doctype html>
<html lang="fr">
<head>
<meta charset="utf-8"/>
</head>
<body>
<h1 style="text-align:center;">403 - Forbidden</h1>
<h2 style="text-align:center;">L'administrateur du serveur n'autorise pas la consultation de cette page.</h2>
<p style="text-align:center;"><img src="/images/structure/403-forbidden.jpg"/></p>
<p>Si vous pensez qu'il s'agit d'une erreur, merci de contacter l'administrateur en lui signifiant l'adresse URL de provenance.</p>
<p>Merci de transmettre des informations sur cette erreur, ainsi que la date et l'heure.</p>
<p>Le mail de l'administrateur : [email protected]</p>
<p><a href="https://www.visionduweb.fr">Retour sur la page d'accueil</a></p>
</body>
</html>

I have installed Bad Bot Blocker with this method : https://wiki.visionduweb.fr/index.php?title=Configurer_le_fichier_.htaccess#Bloquer_des_Bots_et_des_URL_ind.C3.A9sirables_avec_Bad_Bot_Blocker

My VirtualHost : https://wiki.visionduweb.fr/index.php?title=VirtualHosts_des_domaines_enregistr%C3%A9s


ctlui commented May 15, 2020

I was getting the same as you, so I tried the v2.2 instructions and it works now.
As for "AspiegelBot", it's not in globalblacklist.conf the way "zx6.ru" is.


ZerooCool commented May 16, 2020

I have added AspiegelBot to my conf ;)

Where are the v2.2 instructions? But we have Apache 2.4.

For my last test, for AspiegelBot, I used the -L option for curl; then I can do the same with 80legs.
Strangely, if I use curl -A ... -L ..., the redirection works fine.

The curl option -L is the right answer for you and me.

curl -A "80legs" -L https://www.visionduweb.fr
<!doctype html>
<html lang="fr">
<head>
<meta charset="utf-8"/>
</head>
<body>

<h1 style="text-align:center;">403 - Forbidden</h1>
<h2 style="text-align:center;">L'administrateur du serveur n'autorise pas la consultation de cette page.</h2>

<p style="text-align:center;"><img src="/images/structure/403-forbidden.jpg"/></p>

<p>Si vous pensez qu'il s'agit d'une erreur, merci de contacter l'administrateur en lui signifiant l'adresse URL de provenance.</p>
<p>Merci de transmettre des informations sur cette erreur, ainsi que la date et l'heure.</p>
<p>Le mail de l'administrateur : [email protected]</p>

<p><a href="https://www.visionduweb.fr">Retour sur la page d'accueil</a></p>
</body>
</html>
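
For scripted checks, curl can follow the redirect and print only the final status code, which makes the 403-after-redirect behaviour described above easy to verify:

# -s silences progress output, -o /dev/null discards the body,
# -w prints the status code of the last response after -L follows redirects
curl -s -o /dev/null -w "%{http_code}\n" -L -A "80legs" https://www.visionduweb.fr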


dchmelik commented Aug 4, 2020

Not just an Ubuntu problem, but a general Apache 2.4 problem, at least on other POSIX-based OSs (such as stable/up-to-date Slackware GNU/Linux 14.2 with Apache 2.4.43): the same problem; the setup instructions' tests for bad bots result in Apache serving pages fine (with a normal 200 code).

mitchellkrogza (Owner) commented Aug 4, 2020

The blocker does work and is tested across 2.2 up to 2.4.23; see the build log and tests for yourself:
https://travis-ci.org/github/mitchellkrogza/apache-ultimate-bad-bot-blocker

If you mess up your Apache 2.4 permission structure in any way higher up the chain, you break everything below it, including the blocker.



dchmelik commented Aug 7, 2020

The blocker does work and is tested across 2.2 up to 2.4.23 - see the build log and tests for yourself [...]

Fine, but I reported the same problem on more server-focused OSes: are there any such setup/test logs for those?

If you mess up your Apache 2.4 permission structure [...]

It's unclear to me what that means for Apache... can anyone suggest documentation or elaborate on how to debug this?

mitchellkrogza (Owner) commented Aug 7, 2020

Start off by comparing your apache2.conf with the version used in the tests: https://github.com/mitchellkrogza/apache-ultimate-bad-bot-blocker/blob/master/.dev-tools/_test_results/_conf_files_2.4/apache2.conf

Specifically these blocks:

https://github.com/mitchellkrogza/apache-ultimate-bad-bot-blocker/blob/master/.dev-tools/_test_results/_conf_files_2.4/apache2.conf#L159-L180
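
For reference, the access blocks in a stock Debian/Ubuntu apache2.conf look roughly like this (paraphrased from a default install, not copied from the linked test file). If the global <Directory /> deny is loosened, or the /var/www/ grant is removed or reordered, everything below it behaves differently:

<Directory />
    Options FollowSymLinks
    AllowOverride None
    Require all denied
</Directory>

<Directory /var/www/>
    Options Indexes FollowSymLinks
    AllowOverride None
    Require all granted
</Directory>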


mitchellkrogza (Owner) commented Aug 7, 2020

Then also make sure your vhost config follows the same configuration for its main directory block:
https://github.com/mitchellkrogza/apache-ultimate-bad-bot-blocker/blob/master/.dev-tools/_test_results/_conf_files_2.4/testsite.conf
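
Two sanity checks that are independent of the test files: confirm the configuration parses, and confirm the blocker include is actually present in an enabled vhost (paths assume the Debian/Ubuntu layout used earlier in this thread):

# validate syntax and dump the parsed virtual host setup
apachectl -t
apachectl -S
# confirm the blocker include really is loaded from an enabled vhost
grep -rn "globalblacklist.conf" /etc/apache2/sites-enabled/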


skonesam commented:

The links above don't work, but I found the tests anyway. The only (significant?) change I see is that RewriteEngine On is in the test but not in the instructions. It doesn't seem to change the outcome.


amitash commented Feb 6, 2023

I had the same issue, but I was whitelisting my own IP, so everything was allowed. Testing from another machine worked well.
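
That failure mode is easy to check for: if the IP you test from is whitelisted, every request from it is allowed before the bot rules apply. Assuming the custom.d layout used earlier in this thread (the exact whitelist filename varies, so the grep covers the whole directory):

# replace 203.0.113.7 with the IP of the machine you are testing from
grep -rn "203.0.113.7" /etc/apache2/custom.d/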
