URL based IP ban using PHP and htaccess, automated

apps

#1

No one should be going to certain URLs unless they are looking for weaknesses in your site - ban first, ask questions later.

With this, if a visitor hits a specific URL they get banned permanently, via a "deny from" entry appended to your .htaccess file (403 Forbidden style). You also get emailed the details.

Be aware that this appends IPs to the end of your .htaccess file; after a few months it may have around 100 entries, so you may want to clean it out.
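If you want to automate that cleanup, a minimal sketch could look like this. It assumes the bans were appended in the exact lowercase `deny from x.x.x.x` format the script below writes; anything else in the file is left untouched:

```php
<?php
// Hypothetical cleanup script: strips every appended "deny from" line
// from .htaccess, keeping all other directives intact.
$lines = file('.htaccess');
$kept  = array();
foreach ($lines as $line) {
    // Keep any line that does NOT start with "deny from "
    if (strpos($line, 'deny from ') !== 0) {
        $kept[] = $line;
    }
}
file_put_contents('.htaccess', implode('', $kept));
?>
```

Run it once in a while (or from a cron job) to reset the ban list.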

You need -

- your .htaccess file made writable (chmod 666, do this via webFTP from your control panel), NOT 777.
- a .php file (e.g. banjerks.php) which does the ban whenever a user visits it. (If you ban yourself, you need to remove your own IP from the .htaccess file via FTP or webFTP to get back into your own site!)

Note: the lines for cookie1 and cookie2 are just what I use to detect my regular visitors - if one of them "accidentally" got banned I would probably know. It's just a safeguard: remove those two lines if you don't know your cookie names or don't care, otherwise EDIT them.

banjerks.php in main folder

[code]<?php

// Append the visitor's IP as a "deny from" line to .htaccess
$ip = "deny from " . $_SERVER['REMOTE_ADDR'] . "\n";
$banip = '.htaccess';
$fp = fopen($banip, 'a');
fputs($fp, $ip);
fclose($fp);

// @ symbol hides errors from visitors (e.g. if mail() fails)
@mail('you@yourdomain.whatever',
    'Banned IP ' . $_SERVER['REMOTE_ADDR'] .
    ' at ' . $_SERVER['HTTP_REFERER'],
    ' IP ' . $_SERVER['REMOTE_ADDR'] . ' banned' .
    ' request URI ' . $_SERVER['REQUEST_URI'] .
    ' referrer ' . $_SERVER['HTTP_REFERER'] .
    ' agent ' . $_SERVER['HTTP_USER_AGENT'] .
    ' cookie1 ' . $_COOKIE['somecookie'] .
    ' cookie2 ' . $_COOKIE['someothercookieid']);
?>[/code]

Now for the .htaccess parts - we do some internal URL rewriting so that bot crawlers won't be able to avoid it.

You can add more RewriteRule lines if you want; mine has 8 commonly crawled URLs that bots use to look for weaknesses.

.htaccess file in root folder, permission 666

[code]ErrorDocument 403 "403 - Your IP is banned or file is forbidden."
Options FollowSymLinks
RewriteEngine on
RewriteRule ^cgi-bin/formmail.pl /banjerks.php
RewriteRule ^cgi-bin/formmail.cgi /banjerks.php
RewriteRule ^cgi-bin/FormMail.pl /banjerks.php
RewriteRule ^cgi-bin/FormMail.cgi /banjerks.php
RewriteRule ^cgi-bin/formail.cgi /banjerks.php
RewriteRule ^formmail.php /banjerks.php
RewriteRule ^_vti_inf.html /banjerks.php
RewriteRule ^_vti_pvt /banjerks.php
Order Deny,Allow
[/code]
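For reference, after the script has fired a few times the tail of your .htaccess file will look something like this (example addresses from the documentation ranges):

[code]Order Deny,Allow
deny from 203.0.113.45
deny from 198.51.100.7
deny from 192.0.2.199
[/code]

Each of those lines is one appended ban; this is what the cleanup step removes.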
Now, what I do is put a hidden link (make sure it is not clicked on by regular visitors!) in my main pages as the first or second link, and list that same link in the robots.txt file as an excluded link - EVIL crawlers and website-downloading programs that ignore robots.txt will get banned too, as might pre-fetching plugins :wink: . Crawl-delay just makes some search engines crawl less often.
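The hidden link itself can be as simple as the markup below. The filename is whatever you called your ban script; the inline style is one assumed way of keeping humans from seeing or clicking it while crawlers still find the href in the source:

```html
<!-- First link on the page: invisible to humans, tempting to rogue crawlers -->
<a href="/banjerks.php" style="display:none">do not follow</a>
```

Anything that follows this link has either ignored robots.txt or scraped the raw HTML, which is exactly the behavior you want to ban.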

robots.txt

User-agent: *
Disallow: /cgi-bin/
Disallow: /banjerks.php
Crawl-delay: 60
In combination with this script and some modifications from here, you can ban anyone who visits too often - say, more than 200 visits in 30 seconds is not likely someone you'd want viewing your site (depending on what it is about, I guess).
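A minimal rate-limit sketch along those lines, using a file-based hit counter per IP (the 200-hits-in-30-seconds threshold comes from the post above; the ban reuses the same "deny from" append as banjerks.php - all names here are illustrative):

```php
<?php
// Hypothetical rate limiter: include at the top of your pages.
// Counts hits per IP inside a sliding 30-second window.
$ip     = $_SERVER['REMOTE_ADDR'];
$window = 30;    // seconds
$limit  = 200;   // max hits per window
$state  = sys_get_temp_dir() . '/hits_' . md5($ip);

// Load timestamps of recent hits, discarding any outside the window
$hits = array();
if (is_file($state)) {
    foreach (explode("\n", trim(file_get_contents($state))) as $t) {
        if ((int)$t > time() - $window) {
            $hits[] = (int)$t;
        }
    }
}
$hits[] = time();
file_put_contents($state, implode("\n", $hits));

if (count($hits) > $limit) {
    // Same ban mechanism as banjerks.php
    file_put_contents('.htaccess', "deny from $ip\n", FILE_APPEND);
}
?>
```

A shared temp directory works for a single server; on shared hosting you would point `$state` somewhere inside your own space instead.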


#2

Haha, you're plagued by the FormMail bot looking for an exploitable script to send their spam, too? :slight_smile:

In the past, I added their IP to a list and did something similar to what you did. But after a while it got old, and I just stopped doing it. I make sure I have no FormMail.pl script anywhere and leave it at that. It's just a bot checking once; it won't ever return. Pointless to ban it.


yerba# rm -rf /etc
yerba#


#3

Maybe, but it is still fun! And you end up with a log of all the IPs that did it, which may come in handy one day.

Imagine that someone decides they don't like your site and starts trying some basic tricks to cause you grief - you win round 1, match cancelled.