I was looking through my log files and found something very strange:
Googlebot and other search engine robots download only 10%-20% of the real size of the pages!
I have examined several thousand lines.
For example, here is just one line:
184.108.40.206 - - [22/Feb/2007:00:24:45 -0800] "GET /Art_photos/Country_Buildings-25.htm HTTP/1.1" 200 3775 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
But the real size of the file http://gdpit.com/Art_photos/Country_Buildings-25.htm is 17392 bytes!
Yahoo! Slurp shows the same behavior.
I just want to understand: is it the robots doing this, or is DreamHost saving on bandwidth?
Has anyone else seen this?
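One thing I wonder about: could the difference just be gzip compression? Crawlers like Googlebot send an Accept-Encoding: gzip header, and if the server compresses responses, the byte count in the access log would be the compressed transfer size, not the file size on disk. For repetitive HTML that can easily be 10%-20% of the original. Here is a minimal sketch of the kind of ratio gzip achieves (synthetic sample page, not the actual Country_Buildings-25.htm):

```python
import gzip

# Hypothetical repetitive HTML, standing in for a photo-gallery page;
# real pages will compress somewhat less well than this synthetic sample.
html = (
    "<html><body>"
    + "<div class='photo'><img src='photo.jpg'>"
      "<p>Country buildings photo</p></div>" * 200
    + "</body></html>"
).encode("utf-8")

compressed = gzip.compress(html)

print("original:  ", len(html), "bytes")
print("compressed:", len(compressed), "bytes")
print("ratio:     ", round(len(compressed) / len(html) * 100, 1), "%")
```

If that is what is happening, the 3775 bytes in the log against a 17392-byte file would be roughly the same kind of ratio.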