Given the recent complaints of multiple site exploits, which can usually be traced back to multiple sites running under one user, I’ve taken my own advice more seriously and containerised the vast majority of my sites to roughly one app per user, each with both tightened user security (restrictive home-directory permissions) and tightened web security (mod_security enabled).
While this setup certainly helps contain any potential breach, it makes administration much more time consuming. I’m looking for suggestions to make administering multiple sites easier.
Specifically, I’d like to be able to:
run a single cronjob that can scan all sites for changes or anomalies
access all site logs from a single user
avoid redundancies such as multiple copies of PEAR or MaxMind’s GeoIP database
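To make the first point concrete, here is roughly the kind of single cron-driven scan I have in mind. This is only a sketch: the /home/sites layout, the logs/ exclusion, and the crontab line are assumptions about my setup, not a fixed convention.

```shell
#!/bin/sh
# scan_sites: print every file under each per-site home directory that
# was modified in the last 24 hours, skipping log directories.
# Assumes one home directory per site user under the given root.
scan_sites() {
    web_root="$1"
    for home in "$web_root"/*; do
        [ -d "$home" ] || continue
        find "$home" -type f -mtime -1 ! -path '*/logs/*' -print
    done
}

# Hypothetical crontab entry, mailing the nightly report to the admin:
# 15 4 * * *  /usr/local/sbin/scan_sites.sh /home/sites | mail -s "site scan" admin
```

A real intrusion check would compare against stored checksums rather than just mtimes, but even this simple version only has to be set up once instead of once per user.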
This requires some sharing / file access between users. The only way I can think of to do this is to set up a sort of pseudo-root user which has access to all web-hosting users. The web-hosting users would also have limited access to some of the pseudo-root user’s files. This user would:
only be accessed rarely and only through SSH & public key authentication
not host any websites, so it would be immune to web-based exploits
have permissive home-directory permissions so that certain resources could be shared among websites, with careful checks to avoid sharing files outside of the common group. This would include MaxMind’s GeoIP database, which a cronjob updates monthly
have a set of RSA keys (not protected by passphrases) allowing passwordless login to all web-hosting users, so that cronjobs and other scripts could be set up once to scan/update/whatever all web-hosting users’ logs and files
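For the shared-resources bullet, the simplest mechanism I can see is a setgid, group-readable directory owned by the pseudo-root user. This is only a sketch under my assumptions: the group name webshare and the geoip path are placeholders, not anything standard.

```shell
#!/bin/sh
# share_readonly: make a directory readable by a common group but
# closed to everyone else. New files created inside inherit the group
# thanks to the setgid bit.
share_readonly() {
    dir="$1"
    group="$2"
    chgrp "$group" "$dir" 2>/dev/null || :  # silently skipped if the group doesn't exist yet
    # 2750 = setgid + owner rwx, group r-x (read and enter), others nothing
    chmod 2750 "$dir"
}

# Example (hypothetical paths): publish the monthly-updated GeoIP
# database to all web-hosting users in the webshare group.
# share_readonly /home/pseudoroot/share/geoip webshare
```

Each web-hosting user would then be added to the shared group, which keeps the pseudo-root user’s other files, including its private keys, out of reach.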
Does anyone have any thoughts on this plan? Is it secure? Is there an easier way to do it? Is it worth it? What do you do if you follow the best practice of one website/app per user? Do you have the same cronjobs running under each user (assuming you run daily intrusion checks)? How do you keep everything updated if you have many users/apps?