Why does Googlebot download only a part of the pages?

I looked through my log files and found something very strange:

Googlebot and other search engine robots download only 10%-20% of the real size of the pages!
I have examined several thousand lines.

For example, here is one line: - - [22/Feb/2007:00:24:45 -0800] "GET /Art_photos/Country_Buildings-25.htm HTTP/1.1" 200 3775 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

But the real size of the file http://gdpit.com/Art_photos/Country_Buildings-25.htm is 17392 bytes!

Yahoo! Slurp shows the same behavior.

I just want to understand: is it the robots, or is DreamHost saving on bandwidth?

Has anyone seen similar problems?

Have you checked the cached copies of your pages on the search engines to confirm that they contain the full page contents?
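One likely explanation (assuming DreamHost's Apache has HTTP compression such as mod_deflate/mod_gzip enabled): bots like Googlebot send an `Accept-Encoding: gzip` header, and the byte count in the access log records the compressed transfer size, not the size of the HTML file on disk. A quick sketch of how much repetitive HTML shrinks under gzip (the sample markup below is made up for illustration):

```python
import gzip

# Hypothetical sample standing in for a typical gallery page:
# HTML markup is highly repetitive, so it compresses very well.
html = ("<tr><td><a href='/Art_photos/photo.htm'>"
        "<img src='/thumbs/photo.jpg' alt='photo'></a></td></tr>\n") * 200

raw = html.encode("utf-8")
compressed = gzip.compress(raw)

print(f"uncompressed: {len(raw)} bytes")
print(f"gzipped:      {len(compressed)} bytes")
print(f"ratio:        {len(compressed) / len(raw):.0%}")
```

To check the live page, you could compare `curl -s URL | wc -c` against `curl -s -H 'Accept-Encoding: gzip' URL | wc -c` and see whether the second number matches what the log shows.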
