Somehow you haven’t quite understood democracy yet. He is your president, even if you voted against him – just as you are subject to his administration, even though it may not like the fact that it has to cater to all Americans, not just 50%.
The issue is that Dreamhost, in its default setup, uses PHP-CGI. That means that for every PHP request served, a new PHP interpreter has to be started, the script compiled, and then executed. The Wiki claims that this is no slower than mod_php or such, but sorry, that is just pure and utter bullshit.
You can’t use mod_php either (it is active at Dreamhost, but obviously with safe_mode and all kinds of other broken PHP “security” enabled. This is not a fault of DH, but of the ill-conceived security approach safe_mode represents).
You can, however, use PHP-FCGI with eAccelerator or such. That way the PHP interpreter stays persistent (no startup or teardown costs worth mentioning), the scripts are all compiled just once and cached, and the site will be fast. There are tutorials on the wiki for this. If you cannot use mod_php (and with Gallery you can’t), this is the way to go. Even if you can use mod_php, PHP-FCGI may offer benefits because it lets you use eAccelerator or other such caches.
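The wiki tutorials boil down to a small dispatcher script plus a handler declaration in your .htaccess (something along the lines of `AddHandler fastcgi-script .fcgi`, wired up to route .php requests through the dispatcher). A minimal sketch of such a dispatcher, assuming a fairly standard php-cgi path — the exact path and sensible limits vary per account, so check the wiki tutorial for your setup:

```shell
#!/bin/sh
# php.fcgi -- hypothetical FastCGI dispatcher sketch; the binary path and
# the limit values below are assumptions, not DH-specific facts.

# Keep a couple of persistent PHP children around instead of forking a
# fresh interpreter per request (this is where the speedup comes from).
export PHP_FCGI_CHILDREN=2

# Recycle each child after N requests to keep memory leaks in check.
export PHP_FCGI_MAX_REQUESTS=500

exec /usr/local/bin/php-cgi
```

With eAccelerator loaded into that persistent interpreter, the compile-once-and-cache behavior comes for free.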
[quote]2. digg - Almost anyone is going to have serious problems if their site gets seriously dugg or slashdotted. If it happens you have to really stay on top of it and make sure the page in question gets cached or is turned into a static page.[/quote]
Or write the code in a way which does this automatically. Granted, that is hard with AJAX and the like, but I generally write dynamic sites that have the potential to be hit by storms of traffic in a way that makes a manifold increase in hits negligible: in most cases there is no reason for content to be recreated on every page hit; often it is enough to regenerate it every 5 minutes server-side, or even on demand, and cache it. For instance, with some mod_rewrite magic you can check for a cached static version of a page or some content (a simple file-existence check), and if it does not yet exist, redirect the hit to a page that actually generates the content and caches it in that file. The next time somebody accesses it (say, someone from a Slashdot crowd), it will be the static version being served. A cronjob that deletes files older than 5 minutes every 5 minutes takes care of pruning. You can also use this technique within PHP or Perl scripts quite easily (including files in PHP is easy, as are file_exists checks); combined with FastCGI and a decent database layout, you can grow quite a bit before resources become a problem. Using both of these techniques I’ve survived several slashdottings and similar traffic surges without actually having to be present when the foo hits the fan.
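The mod_rewrite trick above can be sketched roughly like this — the `cache/` directory and `makecache.php` are made-up names for illustration, not part of any particular package:

```apache
RewriteEngine On

# If a cached static copy of the requested page exists, serve it directly
# (a plain file hit, no PHP involved at all).
RewriteCond %{DOCUMENT_ROOT}/cache%{REQUEST_URI}.html -f
RewriteRule ^(.*)$ /cache/$1.html [L]

# Otherwise hand the request to the script that generates the page and
# writes the static file so the next visitor gets the cached copy.
RewriteRule ^(.*)$ /makecache.php?page=$1 [L,QSA]
```

and the pruning cronjob is a one-liner:

```shell
# crontab entry: every 5 minutes, delete cached files older than 5 minutes
*/5 * * * * find /home/you/example.com/cache -type f -mmin +5 -delete
```

The nice property of this layout is that under a traffic storm nearly every hit is a static file served straight by Apache; the expensive generator script only ever runs once per cache interval.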
If you want to go REALLY fancy, try implementing a load indicator and have your website change behavior depending on load (dynamic page hits per second over the last 10 minutes, for instance, is a nice indicator). If it crosses a threshold, temporarily disable most dynamically created things (random image, “hottest forum post”, recent comments, etc.), or have users explicitly request a refresh on them.
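A minimal sketch of such a load indicator, assuming a flat-file hit log (the file name and threshold are made up for illustration; a real site might keep the counter in shared memory or the opcode cache instead):

```php
<?php
// Hypothetical load indicator: log a timestamp per dynamic hit and count
// how many fall inside the last 10 minutes.
function hits_in_window($logfile, $window = 600)
{
    $now = time();

    // Record this hit (LOCK_EX so concurrent requests don't clobber each other).
    file_put_contents($logfile, $now . "\n", FILE_APPEND | LOCK_EX);

    // Count the hits that fall inside the window.
    $recent = 0;
    foreach (file($logfile) as $line) {
        if ((int)$line > $now - $window) {
            $recent++;
        }
    }
    return $recent;
}

// Example threshold: more than 10 dynamic hits/sec averaged over 10 minutes.
$busy = hits_in_window('/tmp/hits.log') > 6000;
if ($busy) {
    // Skip the random image, "hottest forum post", recent comments, ...
    // and serve the lightweight version of the page instead.
}
```

The pruning cronjob from the caching trick works here too, or you can simply let old timestamps age out of the window as shown.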