Sparked by the ‘server statistics’ post … I didn’t want to hijack that thread, hence this one.
In all seriousness (and I’m really clueless about this stuff): having been moved to an eval server for excessive CPU use, I do wonder a bit. How much does the assessment of what counts as ‘more than your fair share’ of CPU depend on the nature of the server itself? Is there some fixed number of CPU seconds you get to use, regardless of what percentage of total capacity that amounts to? How do I know what a reasonable amount of CPU to consume actually is? Is my gut feeling that the server matters just one of those things that seems like it OUGHT to be true, but isn’t?
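For context, the only way I even know to peek at what ‘CPU seconds’ might mean is something like the little Python sketch below. This is just my guess at what a host would be counting (the `resource` module reports user + system CPU time on Unix-like systems, which is not the same thing as wall-clock time), and the busy-loop is a stand-in for whatever a real script does:

```python
import resource
import time

wall_start = time.time()

# Stand-in for whatever my script actually does.
sum(i * i for i in range(10_000_000))

# RUSAGE_SELF reports resource usage for this process only.
usage = resource.getrusage(resource.RUSAGE_SELF)
cpu_seconds = usage.ru_utime + usage.ru_stime  # user + system CPU time

print(f"wall clock: {time.time() - wall_start:.2f}s")
print(f"CPU time:   {cpu_seconds:.2f}s")  # presumably what 'excessive CPU use' counts?
```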
Similarly, on the issue of performance: I imagine that one person hosted on a PIII may see snappier performance than 1,000 people hosted on a quad Xeon. Is there some level of traffic at which I’d be considered an OK neighbor on one server, but a greedy process pig on another? Analog tells me I get roughly 10,000 successful page requests a day, out of roughly 60,000 total server requests. Is that light, medium, or heavy traffic? (My back-of-envelope arithmetic is below.)
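For what it’s worth, here’s the arithmetic I did on those Analog numbers to get a feel for them. The 5x peak-hour multiplier is pure guesswork on my part, not anything Analog reports:

```python
# Back-of-envelope: is 60,000 server requests a day a lot?
requests_per_day = 60_000
seconds_per_day = 24 * 60 * 60  # 86,400

avg_rps = requests_per_day / seconds_per_day
print(f"average: {avg_rps:.2f} requests/second")  # ~0.69 req/s

# Traffic is bursty, so peak hours are presumably several times
# the daily average. 5x is just a guess for illustration.
peak_estimate = avg_rps * 5
print(f"rough peak: {peak_estimate:.1f} requests/second")
```

So on average it’s well under one request per second, which sounds tiny to me, but I have no idea whether the average or the peak is what makes one a bad neighbor.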