How to limit access to Perl programs

I have a bunch of Perl programs sitting in a directory under my website root. Some I wrote for learning and testing Perl, while some are actually used on the website. I’ve noticed that someone unauthorized is running those Perl programs, because I get email from the scripts I wrote while learning about sendmail. I’m guessing that someone stumbled onto this directory by guessing the most obvious name for CGI programs and is running the programs listed in their browser window to see what they can learn.

What’s the standard practice for protecting this directory from hacking attempts? I was thinking that a simple fix would be to put a little index.htm file in the directory to prevent listing its contents in the browser, although that listing has been helpful to me at times. Are there password solutions or file-permission settings that I should be using?

This signature line intentionally blank.

The index.html file, or changing the directory listing settings via .htaccess, is not really any protection at all in this circumstance because, as you noticed, the files can still be run.

The best way to deal with your situation is to employ Apache authentication for the directory housing the files. This can be done manually from the shell, or “automagically” via the Control Panel -> Goodies -> htaccess/Webdav screen. If using the Control Panel, make sure to set only the password/user info for .htaccess and not make the directory a WebDAV dir. :wink:

Once done, not only is listing prevented, but running the scripts is also prevented until the proper username/password is entered in the browser (once per session). Good luck!
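If you go the manual route, here is a minimal sketch of what the .htaccess looks like for Basic authentication. The path to the password file and the realm name are assumptions; use whatever matches your own account layout, and keep the password file outside the web root:

```apache
# .htaccess in the directory you want to protect
AuthType Basic
AuthName "Private scripts"
# hypothetical path -- put your .htpasswd somewhere not served by Apache
AuthUserFile /home/username/.htpasswd
Require valid-user
```

You create the password file with the standard htpasswd utility, e.g. `htpasswd -c /home/username/.htpasswd myuser` (the `-c` flag creates the file; omit it when adding more users).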


If I used .htaccess, wouldn’t you get an authentication challenge when accessing any of the scripts in the directory from links on my pages? Like, in the nav bar there is Search Publications. If you clicked on that, you’d have to know the password to search?

A problem I see is that when you even hover the mouse over the publications link, you see the whole path to the script; then you might think about going one step up the tree to see what else is in there that will run. There’s nothing there that can do any damage, I think, but I’d better check. I was thinking an index.htm would at least prevent a listing of all the possible files to try some mischief with. With an index.htm, you’d have to guess at the file names. You would probably guess correctly if you tried some of the script names out of the popular textbooks, but it would make it harder. I could also chmod -x all the script versions that are not currently being used.
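That last idea — stripping the execute bit from unused test scripts — is a one-liner per file from the shell. A quick sketch (the filename here is hypothetical; substitute your actual test scripts):

```shell
# Stand-in for an unused test script sitting in the cgi directory
touch test_sendmail.pl

# Remove execute permission for everyone, so the web server can no
# longer run it as a CGI (the source is still readable/editable)
chmod a-x test_sendmail.pl

# Verify: the mode string should show no 'x' bits
ls -l test_sendmail.pl
```

To re-enable a script later, `chmod u+x,go+x test_sendmail.pl` (or whatever mode your server requires) puts it back in service.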


Yes, you would… but I guess I misunderstood your statement about not wanting “unauthorized” running of the scripts (I assumed that you wanted to protect them from being run by the general public). Unless you want to test for the authorized status of a user before running the script, Apache authentication is probably not what you want to use.

So, if I understand better now what you are asking, you don’t so much want to prevent the “public” from running some of the scripts, just certain ones (the ones not linked from your site?). Is that correct?


Yes, that’s pretty much it. I’d like my website to run seamlessly between pages that are html and the html links that call some Perl program. Maybe I need to be more organized and keep test and training scripts in some other directory. Do most folks avoid such obvious directory names as cgi-bin? Is this a security problem in itself?


Actually, that is what I do myself, and what I suggest you consider. :wink: Otherwise, I suppose there is some “security by obscurity” benefit to just suppressing the directory listing, but not much point to that, IMHO. People will browse “up-tree,” as you pointed out, and suppressing the listing puts only an obstacle, albeit a small one, in the path of running those “test” scripts.

Of course, you can always just suppress the directory listing by placing an .htaccess file containing the directive:

Options -Indexes

into that directory, but using an index.html gives you a little more flexibility (for instance, you could put a meta refresh line into the index.html file to immediately redirect visitors to your home page!).
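For instance, a minimal index.html along those lines might look like this (the `/` redirect target is an assumption; point it at your real home page):

```html
<!-- index.html placed in the cgi directory: its presence suppresses
     the listing, and the meta refresh bounces visitors away after
     0 seconds -->
<html>
  <head>
    <meta http-equiv="refresh" content="0; url=/">
    <title>Nothing to see here</title>
  </head>
  <body></body>
</html>
```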

I think that is very much a matter of personal opinion. On DreamHost, you don’t really need a cgi-bin, though it does sometimes make it easier to install packages that expect certain things to be in a “cgi-bin” directory (you don’t have to edit as much code to change paths and all). :wink:

That said, I don’t think there is anything inherently “unsafe” about having a commonly named directory; just make sure all the permissions for your directories and files are set with the necessary security in mind. Again, the “security by obscurity” concept is what we are talking about here, and I just don’t think there is enough advantage in it to make it worth doing, though YMMV!
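As a sketch of what “permissions set with security in mind” can mean in practice (the filenames and the owner-runs-the-script setup are assumptions; adjust modes to whatever your host actually requires):

```shell
# Stand-ins for a page, a live CGI script, and a private test directory
touch index.html search_pubs.pl
mkdir -p tests

chmod 644 index.html       # pages: owner read/write, everyone else read-only
chmod 755 search_pubs.pl   # live CGI script: must stay executable
chmod 700 tests            # test scripts: owner-only, invisible to the web
```

The idea is simply that only the scripts you intend the public to run carry execute permission, and anything experimental lives in a directory nobody else can even read.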