Htaccess and 403

My htaccess file says:

ErrorDocument 403 /403.php

RewriteCond %{REQUEST_URI} users [NC]
RewriteRule ^(.*)$ 403.php?url=%{HTTP_HOST}%{REQUEST_URI} [R,L]

deny from 1.2.3.4 (and several others)

My 403.php file gets the IP, reports it, and redirects the hacker. The RewriteRule works fine. My problem is that if the IP is listed in a “deny from” line, then the “ErrorDocument” loops: deny from to ErrorDoc to deny from, wrapped round the axle …

I can’t find a solution for “if a deny from matches, run 403.php”.
Thanks

Untested, but you could try moving 403.php to its own directory that has a .htaccess file containing only the directive Require all granted.
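To sketch that idea (the directory name errordocs is just an example; Require all granted is Apache 2.4 syntax, while a 2.2 server using “deny from” would use Order Allow,Deny plus Allow from all instead):

```apache
# /.htaccess (site root): point the error document into the open directory
ErrorDocument 403 /errordocs/403.php

# /errordocs/.htaccess: allow everyone, so the error page itself
# is never denied and the loop cannot start
Require all granted
```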

You need to exclude your 403.php so blocked clients don’t get looped back again.

RewriteCond %{REQUEST_URI} users [NC]
RewriteCond %{REQUEST_URI} !^/403\.php$
RewriteRule ^(.*)$ /403.php?url=%{HTTP_HOST}%{REQUEST_URI} [R,L]

OK, so I tried both suggestions, no joy. It just doesn’t know to ignore the deny the second time around. So: point ErrorDocument 403 at an external URL, ie google or … godhatesfags? or hmmm the source IP? Or, for right now, my.home.server with the addition of /fourohthree.php?ip=%{REMOTE_ADDR}&uri=%{SERVER_NAME}, which sends me an email so that when someone complains they can’t get into one of my web pages I can fix it.
Understand, I host several pages for local orgs, ie the pizza place, and a hacker in China or North Korea isn’t going to order a pizza, the delivery charge … And I like to report hackers to their hosting companies.
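A side note on that ErrorDocument idea: Apache does not expand %{...} variables in an ErrorDocument target, so a query string like ?ip=%{REMOTE_ADDR} would reach the script literally. For a local target, the script can read the same information itself from the environment Apache sets on the internal redirect (a sketch; fourohthree.php is the poster’s own script name):

```apache
# No query string needed: on the internal redirect the script can read
# REMOTE_ADDR, SERVER_NAME, and REDIRECT_URL from its own environment
ErrorDocument 403 /fourohthree.php
```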

You are doing it wrong.

The issue seems to be the script. If you are willing to lose the script and use simple blocking, allowing 403.php is easy.
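For instance, a minimal sketch: a Files block in the same .htaccess exempts the error page from the IP blocks, which stops the deny → ErrorDocument → deny loop (shown in the 2.2-style syntax the “deny from” lines imply; on an Apache 2.4 server the block body would be Require all granted instead):

```apache
ErrorDocument 403 /403.php

# Exempt the error page itself from the IP blocks below,
# so serving the 403 page never triggers another 403
<Files "403.php">
    Order Allow,Deny
    Allow from all
</Files>

deny from 1.2.3.4
```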

Having that loop may be raising server load & possibly slowing the operation of the entire site.

^ This, basically.

A 403 is not the place to continue running code. The 403 is for ending a session while reporting a reason back to the client.

You say I’m doing it wrong, that a 403 should just terminate the connection. OK, how do you, what do you do to, keep bad actors off of your sites? All of our sites/pages are for local use. We don’t use WordPress, but every day I see 2 to 10 requests for “wp-login.php”. So what do we do? Display a polite page explaining that we don’t have a WP page and that we’re very sorry? F! that. To start off with, the odds are 99:1 it’s a bot running IP, IP+1, IP+2, IP+3, looking to find an unlocked door. Our htaccess file has 153 “deny from X.0.0.0/8” lines. It started with none and has grown over time. So I’ll ask again: what do you do to keep bad actors off of your sites?
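(Not from the posts above, just one common lightweight pattern for exactly those wp-login.php probes: refuse the probe URLs outright with mod_rewrite’s [F] flag, which answers 403 without redirecting anywhere, so no script needs to run. The xmlrpc.php alternative is an extra guess at another commonly probed path.)

```apache
RewriteEngine On
# [F] answers with a 403 Forbidden immediately; no external redirect
RewriteRule (wp-login\.php|xmlrpc\.php) - [F,L]
```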

Here’s a post I wrote at another forum that lists the several methods of blocking:

https://www.webmasterworld.com/search_engine_spiders/4848539.htm

In practice, most of us use a comprehensive combination of several of the methods.
