Blocking repetitive requests?


Occasionally, my server resources spike because of a few thousand requests from Russian and other foreign IP addresses. I have no details about the requests (they are not in my log files for the time being).

I’m sure people are aware of blog comment spamming, forum spamming, etc., which are automated requests (in my case, one site received over 175,000 requests from the same host).

I have been able to block requests that match the following conditions using mod_rewrite:

no user agent + no referrer + requesting “/”
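For reference, the rule I’m using is roughly the following (a minimal sketch from memory, assuming mod_rewrite is enabled and this lives in the site’s .htaccess, where the leading "/" is stripped so the root matches the empty pattern):

    # Block requests for "/" that send neither a User-Agent nor a Referer
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} ^$
    RewriteCond %{HTTP_REFERER} ^$
    RewriteRule ^$ - [F,L]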

This is not an ideal way to block repetitive requests, since a bot can easily bypass it by providing a user agent.

I’ve seen some examples of blocking IPs listed in an external file referenced from .htaccess; however, generating that list dynamically, without having to add entries manually, is a bit more difficult.
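The closest thing I’ve seen to a dynamic external list is a RewriteMap setup along these lines (a sketch only, assuming you can edit the main server config, since RewriteMap cannot be declared in .htaccess itself; the /var/www/blocklist.txt path is just a placeholder):

    # httpd.conf / vhost: declare a text map once
    RewriteMap ipblock "txt:/var/www/blocklist.txt"

    # blocklist.txt: one "key value" pair per line, e.g.
    #   203.0.113.7    DENY
    #   198.51.100.22  DENY

    # .htaccess: reject any client IP the map returns DENY for
    RewriteEngine On
    RewriteCond ${ipblock:%{REMOTE_ADDR}|OK} =DENY
    RewriteRule ^ - [F,L]

A script could then regenerate blocklist.txt from the log data without touching .htaccess itself.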

Has anyone solved this issue?

My general conditions would be something like:

  • 15% of total traffic coming from the same IP
  • repetitive requests for the same file within a given time frame (i.e., less than 1 second apart)
  • more than 1,000 requests from the same IP within 60 seconds

These are rough estimates; I just need to figure out how to generate this data from the log files automatically.
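Something along these lines is roughly what I have in mind as a starting point (an untested sketch; it assumes the standard combined log format, chronological log lines, and a made-up path of /var/log/apache2/access.log), checking the per-IP share and the 1,000-requests-per-60-seconds threshold:

    #!/usr/bin/env python3
    # Rough sketch: flag IPs that exceed simple request thresholds in an Apache access log.
    import re
    from collections import defaultdict
    from datetime import datetime, timedelta

    LOG_PATH = "/var/log/apache2/access.log"   # placeholder path
    WINDOW = timedelta(seconds=60)
    MAX_PER_WINDOW = 1000                      # more than 1,000 requests per 60 s
    MAX_SHARE = 0.15                           # 15% of all traffic from one IP

    # Combined log format: ip - - [timestamp] "request" status size "referer" "agent"
    LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\]')

    hits = defaultdict(list)   # ip -> timestamps in log (chronological) order
    total = 0

    with open(LOG_PATH) as fh:
        for line in fh:
            m = LINE_RE.match(line)
            if not m:
                continue
            ip, stamp = m.group(1), m.group(2)
            ts = datetime.strptime(stamp, "%d/%b/%Y:%H:%M:%S %z")  # e.g. 10/Oct/2023:13:55:36 +0000
            hits[ip].append(ts)
            total += 1

    flagged = set()
    for ip, stamps in hits.items():
        if total and len(stamps) / total > MAX_SHARE:
            flagged.add(ip)
            continue
        start = 0
        for end, ts in enumerate(stamps):            # sliding 60-second window
            while ts - stamps[start] > WINDOW:
                start += 1
            if end - start + 1 > MAX_PER_WINDOW:
                flagged.add(ip)
                break

    for ip in sorted(flagged):
        print(ip, "DENY")   # lines ready to append to an external blocklist file

The output format matches the key/value blocklist idea above, but the thresholds and the parsing would obviously need tuning.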

Please advise.