No one should be going to certain URLs unless they are looking for weaknesses in your site - ban first, ask questions later.
With this, if a visitor hits a specific URL they get banned permanently, via a "deny" entry in your .htaccess file (they see a 403 Forbidden from then on). You also get emailed the details.
Be aware that this appends IPs to the end of your .htaccess file; after a few months it may have around 100 entries, so you might want to clean it out periodically.
You need -
write-enable your .htaccess file (chmod it to 666, which you can do via webFTP from your control panel), NOT 777.
a .php file (e.g. banjerks.php) which does the ban whenever a user visits it (if you ban yourself, you'll need to remove your own IP from the .htaccess file via FTP or webFTP to regain access to your own site!)
note: the cookie1 and cookie2 lines are just what I use to recognise my regular visitors - if one of them "accidentally" got banned I would probably know. It's just a safeguard; remove those two lines if you don't know what your cookie names are or don't care, otherwise EDIT them.
banjerks.php in main folder[code]<?php
// append a deny line for this visitor's IP to .htaccess
$ip = "deny from " . $_SERVER['REMOTE_ADDR'] . "\n";
$banip = '.htaccess';
$fp = fopen($banip, "a");
fputs($fp, $ip);
fclose($fp);
// @ symbol hides errors from visitors
@mail('you@example.com', // EDIT: your own email address here
'Banned IP '.$_SERVER['REMOTE_ADDR'].
' at '.$_SERVER['HTTP_REFERER'],
' IP '.$_SERVER['REMOTE_ADDR'].' banned'.
' request URI '.$_SERVER['REQUEST_URI'].
' referrer '.$_SERVER['HTTP_REFERER'].
' agent '.$_SERVER['HTTP_USER_AGENT'].
' cookie1 '.$_COOKIE['somecookie'].
' cookie2 '.$_COOKIE['someothercookieid']);
?>[/code]Now for the .htaccess parts - we do some internal URL rewriting so that bot crawlers won't be able to avoid it.
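For reference, each visit to the ban URL appends one "deny from" line to the end of .htaccess, like the following (these IPs are documentation examples, not real bans):[code]deny from 203.0.113.7
deny from 198.51.100.22[/code]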
You can add more RewriteRule lines if you want; mine has 8 commonly probed URLs that bots use to look for weaknesses.
.htaccess file in root folder, permission 666[code]ErrorDocument 403 "403 - Your IP is banned or file is forbidden."
Options +FollowSymLinks
RewriteEngine On
RewriteRule ^cgi-bin/formmail.pl /banjerks.php
RewriteRule ^cgi-bin/formmail.cgi /banjerks.php
RewriteRule ^cgi-bin/FormMail.pl /banjerks.php
RewriteRule ^cgi-bin/FormMail.cgi /banjerks.php
RewriteRule ^cgi-bin/formail.cgi /banjerks.php
RewriteRule ^formmail.php /banjerks.php
RewriteRule ^_vti_inf.html /banjerks.php
RewriteRule ^_vti_pvt /banjerks.php[/code]
Now, what I do is put a hidden link (make sure it is not clicked on by regular visitors!) in my main pages as the first or second link, and list that same link in the robots.txt file as an excluded URL. EVIL crawlers and website-downloading programs that ignore robots.txt will get banned too, and so might pre-fetching plugins. Crawl-delay just makes some search engines do their job less often.
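The robots.txt half of that honeypot might look like this (a sketch assuming your hidden link points straight at banjerks.php - use whatever URL your hidden link actually targets. Note that Crawl-delay is a non-standard directive that only some search engines honour, and 30 is just an example value):[code]User-agent: *
Disallow: /banjerks.php
Crawl-delay: 30[/code]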
In combination with this script and some modifications from here, you can ban anyone who visits too often (say, more than 200 visits in 30 seconds is not likely to be someone you'd want viewing your site, depending on what it is about, I guess).
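A minimal sketch of that visit-counting idea, not the linked script itself. The counter file name, the 200-hit threshold, and the 30-second window are all example values of mine:

```php
<?php
// Crude per-IP rate limiter sketch - keeps one small timestamp file per IP.
$window  = 30;    // seconds
$maxHits = 200;   // requests allowed inside the window
$ip   = isset($_SERVER['REMOTE_ADDR']) ? $_SERVER['REMOTE_ADDR'] : '0.0.0.0';
$file = 'hits_' . md5($ip) . '.txt';   // example counter file name

// keep only timestamps still inside the window, then record this hit
$hits = array();
if (file_exists($file)) {
    foreach (explode("\n", trim(file_get_contents($file))) as $t) {
        if ((int)$t > time() - $window) {
            $hits[] = (int)$t;
        }
    }
}
$hits[] = time();
file_put_contents($file, implode("\n", $hits));

if (count($hits) > $maxHits) {
    include 'banjerks.php';   // reuse the ban script from above
    exit;
}
```

Include it at the top of your pages; once an IP racks up more than $maxHits requests inside the window, it gets fed to the same ban script.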