I had this question running through my head today; if someone can explain it to me, it'd be great.
Let's say you buy a server in a datacenter; you usually get 1000 GB or 2000 GB of bandwidth per month and no more. How is this measured? I mean, if 1000 people came to your site and each downloaded 1 GB at the same time, your server normally wouldn't be able to handle that. And say the datacenter has a T3 line, which is about 45 Mbps, or roughly 5.6 MB per second, shared across all of its servers. If that's the case, why do you need to buy another server if you want more than 2000 GB of bandwidth a month? Why not just keep using the same server? I'm sorry if the question is unclear; I'm a little confused about this.
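Here's roughly the math I'm doing, in case it helps show where I'm confused (just a back-of-envelope sketch; the 1000 GB/month plan and the ~45 Mbps T3 rate are only the example numbers from my question):

```python
# Rough back-of-envelope conversions (decimal units: 1 GB = 1000 MB, 1 MB = 8 Mb).

SECONDS_PER_MONTH = 30 * 24 * 3600           # ~2.59 million seconds

monthly_cap_gb = 1000                        # a typical "1000 GB/month" plan
avg_rate_mbps = monthly_cap_gb * 1000 * 8 / SECONDS_PER_MONTH
print(f"1000 GB/month is only ~{avg_rate_mbps:.2f} Mbps if spread evenly")  # ~3.1 Mbps

t3_mbps = 45                                 # a T3/DS3 line is ~44.7 Mbps
print(f"A T3 line moves ~{t3_mbps / 8:.1f} MB per second")                  # ~5.6 MB/s

# 1000 people each pulling 1 GB at the same moment would need far more than that,
# which is why the per-second pipe and the per-month transfer cap seem like
# two separate limits to me.
```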
Another question: let's say you have one server that's 2.0 GHz and another that's 4.0 GHz. Would the 4.0 GHz server be able to handle more connections than the 2.0 GHz one? Also, when does RAM come into play? When would you want a server with only 2-4 GB of RAM, and when would you need one with 64 GB?
I couldn't find a tutorial or guide explaining this, so if you can point me to some information, that'd be great as well. Thank you.