Setting proper permissions


#1

How do I go about preventing source disclosure, or stopping the world from downloading certain files? e.g. SSI files stored in the web dir, or ‘sensitive’ files (files one typically wants to make available only to the web application).

I am trying to discourage the snoops, the script kiddies out there, as well as the lazy buggers who misuse ‘web site capturing’ apps to steal other people’s work.

All help/advice will be appreciated.

-Marsbar


#2

add this to an .htaccess file in your web root:

Options -Indexes

and it will stop directory listings, i.e. the automatic listing of a directory’s files when there is no default file (such as “index.html”) to load.

“web site capturing” apps only grab rendered pages, so the only source they’d be getting is (x)html. There is no real way (or reason, imho) of concealing your markup; there are a few server-side things out there that I’ve heard of, but I’ve never cared enough to look.

Google turned up this page; most of the techniques it describes are laughably easy to get around:
http://www.devwebpro.com/devwebpro-39-20030428Is-Someone-Stealing-Your-Source-Code.html

[color=#0000CC]jason[/color]


#3

I don’t think there is a 100% foolproof way. Browsers like Firebird list all links, files etc. used by a page, right down to external stylesheets and form actions. I’ve even tried page encryption software (encrypts the entire page), and it still shows in Firebird.

I don’t know the first thing about more advanced scripting, or things like ASP. But if you look at major business sites, they seem to be able to serve files without giving away too much info. Even so, it seems that if a page appears in your browser, anything that has anything to do with it is pretty easily accessible.

~Michelle


#4

Many thanks for your responses, Jason and Michelle.

I did not explain my problem and situation very clearly in my previous post.

I agree with Jason that there is no real reason for concealing one’s markup, and I agree with Michelle that there is no easy way to prevent stylesheets from being downloaded. No, I do not worry about hiding my design or stylesheets, because I realise people can view it all when they download and/or study the source code of my pages. So, I guess the first example I gave in my previous post was a bad one. :-p

I have already disabled directory listing using .htaccess. But I think it is through file permissions that I ensure my scripts and certain config files that need to stay in the web directory* can only be read/run by the server, and setting correct permissions is what I need advice on. My config files (in PHP format) contain sensitive data; how do I go about making sure those files are protected from prying eyes?

And can html templates used by certain scripts be protected against web capturing apps?

*Need to stay in the web dir?
Perhaps the config files could all be relocated outside the web directory? But how do my pages and/or scripts call those files outside the web directory?

-Marsbar


#5

chmod to 700 is read/write/execute for the owner only

to reference something outside your web root, use the full path (e.g. /home/user/hiddenfilesdir/ when your web root is /home/user/domain.tld)
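
For example, something like this (just a rough sketch using the paths above - “config.php” is a stand-in for whatever your config files are called, and owner-only permissions like 700/600 only help if your scripts run as your own user, e.g. under suEXEC/CGI; if PHP runs as the web server’s user, the server will need read access too):

mkdir /home/user/hiddenfilesdir
mv /home/user/domain.tld/config.php /home/user/hiddenfilesdir/
chmod 700 /home/user/hiddenfilesdir            # owner-only on the directory
chmod 600 /home/user/hiddenfilesdir/config.php # owner read/write on the file

then, inside a script in the web root, pull it in by its full path, e.g. require '/home/user/hiddenfilesdir/config.php';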

web capturing apps will not grab the templates, but they will grab the final, assembled output in plain html

[color=#0000CC]jason[/color]


#6

Terrific! Thanks for your help, Jason.
-Marsbar


#7

In certain cases, you may be able to use .htaccess rules to prevent a user from reading a file. Of course, if the user’s browser (as opposed to the server) NEEDS to download the file, there’s a lot less you can do.

Here’s an example which would prevent files starting with “.ht” from being downloaded.

<Files ~ "^\.ht">
Order allow,deny
Deny from all
</Files>
In the case of scripts / programs which are executed by the server, the user should only be able to download the output of the script, rather than the script itself.
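
Even so, it wouldn’t hurt to add a similar rule for the config files themselves, e.g. (assuming they’re named something like “config.php” - adjust the name or pattern to match yours):

<Files "config.php">
Order allow,deny
Deny from all
</Files>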

Not sure if this totally answers your question, but hope it helps.


#8

Many thanks, Will, for your helpful example. I have updated my .htaccess file, and I believe it is working as intended (i.e. protected files remain readable by the server, but not by the users’ browser).
-Marsbar


#9

Sorry for bringing this old topic up again… but dumb old marsbar needs some reassurance.

According to the server logs, a certain machine had downloaded heaps of files from my site (eating up some bandwidth) THREE times in the past two days - twice using WebCopier and once using Sitesnagger.

I can understand if one wanted to browse a site offline, but why would one need to download the same files three times using two different site copiers in two days? Perhaps one was after a particular file from my site? Hmm… do I sound paranoid now? I could not go through each line in the log to find out what had been downloaded and what hadn’t. I just want to make sure script config files (mostly php files that contain db access details) are safe from prying eyes. What is the best way to check my files are safe? (No, I do not mean run webcopier and see what it can download :-))

  • marsbar

#10

[quote]I could not go through each line in the log to find out what had been downloaded and what hadn’t.[/quote]

One way to do this would be to search for the user agent using grep. For example:

grep "FooScraper" logs/yourdomain.com/http/access.log

(…assuming your domain is “yourdomain.com”, you’re looking for requests made by a client that identifies itself as “FooScraper”, and the scrapings happened that day - there are older log files in that same directory you can look inside of, too, and you can even view failed requests in error.log)
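
If you want a quick summary of what those scrapers actually fetched, something along these lines should do it (assuming the usual Apache access log format, where the requested path is the seventh whitespace-separated field; the agent names are the two you mentioned):

grep -Ei "webcopier|sitesnagger" logs/yourdomain.com/http/access.log | awk '{print $7}' | sort | uniq -c | sort -rn

That prints each requested path with a hit count, most-fetched first.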

[quote]I just want to make sure script config files (mostly php files that contain db access details) are safe from prying eyes. What is the best way to check my files are safe?[/quote]

Honestly, the best way to do it is to simply try viewing those files in a web browser. If you can’t get to the information in question, neither can a scraper (scrapers are really just ‘dumb’ web clients, with no more ability to view inaccessible files than any other).
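
Or, if you prefer the command line, something like this (the filename is just a placeholder for whichever file you’ve protected):

curl -I http://yourdomain.com/config.php

A “403 Forbidden” status means your .htaccess rule is doing its job; even a 200 on a PHP file only ever returns the script’s output, never its source.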

  • Jeff @ DreamHost
  • DH Discussion Forum Admin

#11

Many thanks for your helpful advice, Jeff. I have checked the logs from the past few days. The scraper did not take anything I did not want to share with the public. Phew. :-)

  • marsbar