I’ve been collecting a database of IP addresses which are doing all forms of badness to my WordPress installs. I’ve written a Perl script which post-processes logs from a few of my sites and looks for people requesting stuff that I don’t have (and have never had). I’ve also set a few traps, like telling robots.txt to ignore /blackhole, and I’ve watched lots of bots fall in, plus lots of bots trying to hack WordPress or probe for compromised plugins. The script writes the bad players to a MySQL database.
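For reference, the trap itself is just a disallowed path that no legitimate visitor would ever request (the /blackhole path is from my setup; whatever sits behind it only needs to log the visitor's IP):

```
User-agent: *
Disallow: /blackhole/
```

Any client that fetches /blackhole/ anyway has either ignored robots.txt or harvested it looking for interesting paths, which is exactly the behavior I want to flag.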
The database has just shy of 1,600 entries. My question is: if I created an .htaccess file denying each of those 1,600 addresses individually, how much overhead would that add to Apache? Would I notice it?
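To make the question concrete, here's a sketch (not my actual script) of what generating that file would look like. In reality the IPs would come out of the MySQL table; the hypothetical list here just stands in for that query result, and the output uses the Apache 2.2-style `Order`/`Deny` syntax (on 2.4 it would be `Require not ip` inside a `RequireAll` block instead):

```python
def htaccess_rules(bad_ips):
    """Return Apache 2.2-style deny rules, one line per bad IP."""
    lines = ["Order Allow,Deny", "Allow from all"]
    # With Order Allow,Deny, the Deny directives win for listed hosts,
    # so each entry blocks exactly one address.
    lines += [f"Deny from {ip}" for ip in bad_ips]
    return "\n".join(lines)

# Two documentation-range addresses standing in for the real table:
print(htaccess_rules(["192.0.2.10", "198.51.100.7"]))
```

Worth noting for the overhead question: because .htaccess is re-read and re-parsed on every request when AllowOverride is on, all 1,600 lines get scanned linearly each time, which is where any cost would show up.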
I realize this process is somewhat futile because it's whack-a-mole: I'm identifying lots of potentially compromised PCs which may be cleaned up in the future and pose no risk after cleanup, but which are dangerous while the bad actors behind them are in control.