I think I understand what you are asking, and I think you are asking all the right questions. While I am not the "expert" on all these things, I have learned a few things that might be helpful. Hopefully some of the real wizards with this stuff will join in and add to my initial comments.
I tend to begin my initial research into why a site performs slowly with some basic steps:
1) Log into the shell with an ssh client and note the server load at various times of the day - particularly when the site is performing badly. "w" will do, and given the nature of the multi-core servers DH uses, I don't get concerned till the load gets up above 10 or so (though YMMV). Note that an exceedingly high load may be evidence of an NFS bottleneck or other filer problem more so than a real "load" on the server.
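As a sketch of that first step, here is a tiny logger you could run from cron every few minutes so you can correlate load spikes with slow periods later. The log file name and timestamp format are my own arbitrary choices, not anything DreamHost-specific:

```shell
#!/bin/sh
# Append a timestamped load-average sample to a log file.
# LOGFILE is an arbitrary name; point it anywhere you like.
LOGFILE="load.log"

# `uptime` (like the header of `w`) ends its output with the 1-, 5-,
# and 15-minute load averages; awk splits on the "load average:" label
# and keeps everything after it.
echo "$(date '+%Y-%m-%d %H:%M') $(uptime | awk -F 'load averages?: ' '{print $2}')" >> "$LOGFILE"

# Show the sample we just recorded.
tail -n 1 "$LOGFILE"
```

Skimming that log after a bad afternoon tells you quickly whether the slowdowns line up with load spikes or not.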
2) When performance varies, I try to evaluate to what degree my own connectivity issues might be responsible - traceroute is my first tool here, and I'm looking for problematic routing, malfunctioning routers, and other indications that latency or sporadic connectivity may be interfering with my receiving served pages efficiently.
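A quick way to spot the problematic hops is to filter traceroute output for high round-trip times. The sample output below is made up for illustration and simplified to one probe per hop (real traceroute prints three RTTs per hop); in practice you would pipe `traceroute yoursite.example` in directly:

```shell
#!/bin/sh
# Flag any hop whose round-trip time exceeds a threshold (in ms).
# The threshold and the sample hops below are illustrative only.
THRESHOLD_MS=100

cat <<'EOF' | awk -v t="$THRESHOLD_MS" '$NF == "ms" && $(NF-1) > t {print "slow hop:", $0}'
 1  192.168.1.1  1.2 ms
 2  10.0.0.1  8.5 ms
 3  203.0.113.5  152.7 ms
EOF
```

Consistently slow hops near your end point to your ISP; slow hops near the far end point toward the host's network.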
3) I use the Firefox extension YSlow to develop metrics for the basic loading of my pages and to see where the optimization of the rendered page is deficient.
4) Where dynamic pages generated using MySQL are involved, I try to optimize MySQL operation as much as practicable. Things like maximizing the number of queries per connection, using the most efficient queries, and indexing tables are my starting points for this - as you know, this is a whole science unto itself.
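To make the indexing point concrete, here is the general workflow I mean, written out as a SQL script. The table and column names (posts, author_id) are hypothetical, purely for illustration - on the server you would feed this to the `mysql` client against your own schema:

```shell
#!/bin/sh
# Write an illustrative indexing/EXPLAIN workflow to a file.
# On a real server you'd run: mysql -u you -p yourdb < tune.sql
cat > tune.sql <<'EOF'
-- First, ask MySQL how it plans to execute a query you suspect is slow:
EXPLAIN SELECT * FROM posts WHERE author_id = 42;

-- If the plan shows a full table scan on a large table, index the
-- column you're filtering on:
CREATE INDEX idx_posts_author ON posts (author_id);

-- Re-run the EXPLAIN; the plan should now use idx_posts_author
-- instead of scanning every row.
EOF
cat tune.sql
```

EXPLAIN before and after is the honest way to verify an index actually helped, rather than assuming it did.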
5) Finally, I look at the code itself, initially looking for obviously inefficient or redundant stuff that should be optimized, or eliminated altogether.
These initial steps, along with collecting and comparing connectivity and load data during periods of both good and bad performance, will often indicate whether the "shared" aspects of a shared server are the source of the bottleneck, and will help DH Tech Support identify the source of the problem.
Of course, optimizing databases, modifying code, and having "hoggish" neighbors throttled will only accomplish so much; some sites will require the "upgrade" or even a dedicated server (or two), and I don't know of any way to avoid that, but I have found that the things I have described above, if properly managed, will get you a lot of usability out of DreamHost servers.