Best compromise between security and convenience


Given the recent complaints of multiple site exploits, which can usually be traced back to hosting multiple sites under one user, I’ve taken my own advice more seriously and containerised the vast majority of my sites to roughly one app per user, each and every one with both enhanced user security (restrictive home directory permissions) and enhanced web security (mod_security enabled).

While this setup certainly helps contain any potential breach, it makes administration much more time consuming. I’m looking for suggestions to make administration of multiple sites easier.

Specifically, I’d like to be able to:

- run a single cronjob that can scan all sites for changes or anomalies
- access all site logs from a single user
- avoid redundancies such as multiple copies of PEAR or MaxMind’s GeoIP database

This requires some sharing / file access between users. The only way I can think of to do this is to set up a sort of pseudo-root user which has access to all web-hosting users. The web-hosting users would also have limited access to some of the pseudo-root user’s files. This user would:

- only be accessed rarely, and only through SSH and public key authentication
- not host any websites, so it would be immune to web-based exploits
- have permissive home directory settings so that certain resources could be shared among websites, with careful checks to avoid sharing files outside of the common group. This would include MaxMind’s GeoIP database, which is updated monthly by a cronjob
- have a set of RSA keys (not protected by passphrases) which allow passwordless login to all web-hosting users, so that cronjobs and other scripts could be set up once to scan/update/whatever all web-hosting users’ logs and files

Does anyone have any ideas on this plan? Is it secure? Is there an easier way to do it? Is it worth it? What do you do if you follow best practices of one website/app per user? Do you have the same cronjobs running in each user (assuming you run daily intrusion checks)? How do you keep everything updated if you have many users/apps?


Setting Enhanced Security on an account locks it down (adm group), so you won’t be able to symlink even to a 777 dir between any enhanced accounts.

The only way I could see the single crontab scan idea working is if you have Support set one up for you using the adm account. They’d probably do it if you ask nicely, but make sure you get it right 'cause they might get sick of you bugging them to alter it.

The ‘pseudo root’ would work the other way around. Have a non-secure “resource” account and make shares available within it to all your secured accounts. You could use symlinks from the secured accounts to shared resources on the resource account. Set up a crontab on each secured account to tar/gzip or copy their logfiles each day over into the resource account. If you need them updated more often, then rsync or similar on a 5m cron might be an idea.
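The per-account log export described above could be a one-line cron entry calling something like the following sketch. The paths here are throwaway stand-ins (created with `mktemp`) rather than real DreamHost home directories:

```shell
#!/bin/sh
# Daily log export from a secured account into the shared resource account.
# SRC and DEST stand in for the secured user's home and the resource
# account's group-writable drop directory.
SRC=$(mktemp -d)
DEST=$(mktemp -d)
echo '10.0.0.1 - - "GET / HTTP/1.1" 200' > "$SRC/access.log"

STAMP=$(date +%Y-%m-%d)
tar czf "$DEST/logs-$STAMP.tar.gz" -C "$SRC" .
chmod 640 "$DEST/logs-$STAMP.tar.gz"   # owner+group readable, no world access
```

In practice the only per-account difference is the source path, so the same script can be dropped into every secured account unchanged.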

Setting readability of files/dirs to a maximum of GROUP will keep any same-server prying eyes out.


Thanks for the advice. I think the non-secure resource account, rather than a pseudo-root, is the way to go. Having each secured user send out the files needed on a daily basis makes sense. I just didn’t want to set up a bunch of cronjobs, which is why I was thinking the other way around.

I was hoping that I could set up the cronjob to SSH to the secured account and temporarily mount it which would give the cronjob full access to the secured account. I don’t like the idea of having the private, unprotected keys on the server though, which is why I was hesitating.

I guess the best way is to do any scans, checks, verifications, etc as a cronjob in each secured user and export the results to the resource user. Then have a cronjob in that user collate the results and send them to me while also working as a shared repository to avoid redundancy.

Any other approaches? Or does everyone with multiple users just accept the fact that housekeeping and maintenance chores increase as more users are added to the stable?


Bobocat & sXi,

My friend Tessa came up with a brilliant method using a variant of this discussion: Making DB passwords for Wordpress and other packages unreadable by hackers on a hacked website. Sort of a last line of control. The concept is as follows:

  1. Create a DH FTP-only account without enhanced security. Make sure all the other users have enhanced security.

  2. Create a directory on this FTP account (you have to have a directory or this doesn’t work).

  3. Move wp-config from your website into this directory on the FTP account.

  4. On your website, create a wp-config file that ONLY has a PHP include pointing to the user/directory/wp-config of the FTP-only user.

  5. Make sure it works.

  6. On the FTP-only account, chmod the wp-config file to 440, then chmod the FTP directory the file is in to 110.

  7. It should still work, and your ‘include’ with your actual password is now in a separate FTP account with execute-only privilege.
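Translated to shell commands, steps 2–6 look roughly like this. Both directories below are made-up stand-ins for the FTP-only user’s home and the website’s root:

```shell
#!/bin/sh
# Stand-ins for the FTP-only user's directory and the website's root.
SECURE=$(mktemp -d)
SITE=$(mktemp -d)

# The real config, holding the DB password, lives with the FTP-only user.
printf '<?php $db_password = "secret";\n' > "$SECURE/wp-config.php"

# The site's wp-config.php is reduced to a single include of the real one.
printf '<?php include "%s/wp-config.php";\n' "$SECURE" > "$SITE/wp-config.php"

chmod 440 "$SECURE/wp-config.php"   # owner/group read only
chmod 110 "$SECURE"                 # execute-only: traversable but not listable
```

The execute-only directory is the key trick: a process that knows the exact filename can still open it through the include, but nothing can list the directory to discover what’s inside.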

I’m trying to get Tessa to write this up in language everyone can understand, but I wanted to give you guys a heads up.

I’ve used it for both php includes and perl cnf files!



I use that approach for statics and commons.

With respect to WP… doesn’t WP Update require write permissions to the config?


It might, but the file is still there, just not the passwords! I’ve been configured like this for a month with no issues…



I don’t use WP - although I did suffer an account intrusion due to leaving an old one-click lying around (that’ll teach me for being too lazy to clean up after myself lol). The reason I asked is that I do recall WP needing access to the config during an update process a few years ago. If memory serves me correctly it was due to implementation of some SALT-NONCE stuff.

It’s likely the case that it doesn’t write to config during updates unless it’s a major update (like that one was). Might be something to keep in mind or note down for anyone following your setup guidelines.

Hope you’re prepared for the users question onslaught you’ll probably receive :smiley:


If I weren’t so philosophically sure that protecting your WP password is a REALLY good thing, then no I wouldn’t put up with the onslaught. But, I really believe that this will make a difference. So over the next week or so I’ll document it and post it.


Great suggestion, I’ll try to tell people that a major upgrade may need to edit the whole file, not just a shell of a file.



I’m very happy. This thread has already been very productive. I’ve already started to implement some of these ideas and, save one minor problem I haven’t solved yet, it’s working well. I like the method of protecting DB logins as well. If I have time, I’ll update the wiki when I get it all working. This seems like a very good approach to increasing security on DH, so it should be documented. If anyone has the time (ha!), please help edit the wiki.

I wonder if DH shouldn’t be prodded to setup this sort of system of users by default…


Although that might appear to be a really good idea, it would not be “expected behaviour” to a non-customised script installation process, nor to an end user.

A wiki article outlining various ways to “harden” an account for the more security minded Dreamhosters amongst us is an excellent idea… and I’m all for the idea of you two blokes writing it :smiley:


Damn, I was thinking exactly the same thing!


Any idea which dreamhost wiki topic? I’ll try to write it. I’ll work on my friend to blog it, then I’ll try to add it in the appropriate space.


You could just start a topic, then link to it from the Security portal and the general security page. Something like hardening shared hosting accounts or shared hosting account management best practices or something along those lines might be appropriate.


Hey Bobocat,

[quote=“bobocat, post:13, topic:57163”]
You could just start a topic, then link to it from the… [/quote]

Thanks for the heads-up on this topic. Looks like I was on the right track; I just hated all the cron jobs. Between my boss and me we have 50+ domains, plus a few of my own and some clients’. The client ones aren’t so bad since they’re already one app per account, but the 50+ domains are a bear…

Luckily it’ll be a write once, publish/tweak many operation.

All the domains are a mix of Dreamweaver, Joomla, Drupal, and WordPress. For the Dreamweaver sites we already do the publishing from a network backup drive (MySecureBackup), but as I told the boss, even if we have a backup/devel version of the source on that drive for the WP, Joomla, and Drupal sites, you still need a backup of the live data/database. One plugin update, Joomla install, or article update and your network backup is useless.

Thanks again,


Exactly what I’d like to do. I’d love to have a central user that can reach into all other users so that I can use a single cron job to scan & monitor, but still retain the advantages of one app per user.

I like the ideas here regarding a common user for shared resources, and will set up such a system soon, but I’m still thinking of doing it the other way around as well. Probably the best way to do this is to set up a single user with no web services, place that user’s public key in all other users’ authorized_keys files, then use ssh-agent and a cron job so that user can SSH into each domain, run some checks, gather some stats, verify site integrity, etc., then close down the connection.
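As a rough sketch of that central-monitor idea, the crontab entry and driver script might look like the following. Everything here is hypothetical — the usernames, the `daily-check.sh` script, and the log paths are invented — and it assumes the monitor user’s public key has already been installed in each web user’s `~/.ssh/authorized_keys`:

```shell
#!/bin/sh
# monitor-all.sh -- run from the central monitor user's crontab, e.g.:
#   15 3 * * * $HOME/bin/monitor-all.sh
# Assumes key-based, passwordless SSH from this user to each web user.
for u in site1_user site2_user site3_user; do
    # BatchMode makes ssh fail fast instead of prompting for a password
    # if the key is missing or rejected.
    ssh -o BatchMode=yes "$u@localhost" 'sh ~/bin/daily-check.sh' \
        >> "$HOME/logs/$u.check" 2>&1
done
```

One caveat worth noting: cron jobs don’t inherit a running ssh-agent session, so in practice this usually means a dedicated unencrypted key, which is exactly the trade-off being debated in this thread.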

Is that the approach you’re going to take?


I’d suggest something like the DreamHost PS Hardening and Management page, only titled DreamHost Shared Hosting Hardening and Management; then we could have links to Joomla/Drupal/WordPress/other-app-specific topics in something like Joomla Hardening and Management.

What do ya think?



I was leaning toward that approach and was considering using the approach outlined in the following post. But I have to admit I’m no Unix guru so I didn’t want to implement a less secure solution when security is the issue in the first place.

I like the SSH/passwordless solution because I can still use that same code and just add the user@host to the rsync line.

I’ve been “lucky”, I guess, since I’ve only had one site compromised, but I’m pretty diligent about not accepting defaults, no one-click anything, update religiously etc… The single compromise happened after an uninstall of CiviCRM left a directory/files with 777 permissions.

Jw

I set up that sort of solution for a client just prior to when I wrote that previous thread, and it works fine for a couple of users. They only have three domains, each running Joomla. So I created an sFTP/SSH-only account, added a group, created the backups directory, chgrp’d it, and added all the existing users to that new group. Then I gave the credentials to the network admin so he could off-site the archives. But it’s still three of the same cron jobs, which rubs me the wrong way as a programmer. Definitely not DRY.



These ideas are actually working out quite nicely. One advantage is being able to store config or rc files in the common user, which all other users can read, but setting the permissions so that only the owner can write to them. The owner can be a user which, like the common user, does not host any domains, for security.

Now when I log into any user, the environments will be the same because they all share certain files. I guess cronjobs could similarly be set up to run a common bash script so that a single edit could change all of them!
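That last idea — one shared script, edited in one place — can be sketched as follows. Each user’s crontab holds a single line invoking a script owned by the common user; the directory and script name below are invented stand-ins:

```shell
#!/bin/sh
# Stand-in for the common user's bin directory; in practice something
# like /home/common/bin, writable only by its owner, group-executable.
COMMON=$(mktemp -d)

cat > "$COMMON/daily-check.sh" <<'EOF'
#!/bin/sh
# Example shared check: list world-writable files under the given directory.
find "${1:-$HOME}" -type f -perm -0002 2>/dev/null
EOF
chmod 750 "$COMMON/daily-check.sh"   # owner rwx, group rx, no world access

# Each web user's crontab then needs only a single line like:
#   30 4 * * * /home/common/bin/daily-check.sh "$HOME"
"$COMMON/daily-check.sh" "$COMMON"
```

Editing `daily-check.sh` once changes the behaviour for every user on the next cron run, which is exactly the DRY property missing from the per-account crontab approach.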

Thanks everyone for the ideas.


Would just moving the user/password lines from the wp-config file to the secondary user also work, if you include the user/password file in the config.php?

I would think that this should allow WP to update the config file if needed (unless WP wanted to change the user or password variables itself).


Work for what purpose? It still needs to be readable, so it wouldn’t prevent an adversary from obtaining your DB credentials. It would make it easier for an admin to change the credentials on many sites if they were all in one place.


I embarrassed myself with this solution. There is a way to use include and bury the include in an execute-only directory on the FTP-only user; however, WP in their infinite wisdom made the password a global. :frowning: