Images occasionally not loading



I’ve got a new page on my site with lots of thumbnails, and occasionally when I visit it, several of the images fail to load and I see only the “alt” text. Which images fail varies with each reload. Because this is sporadic, I’m assuming it’s tied to server load, but I’m wondering if I’m simply putting too much data on one page.

Is there a good rule of thumb for how many images/bytes a page can use and still load reliably? I’ve got ~60 thumbnails at 150x100, around 40 KB each. Too many?
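For what it’s worth, here’s my back-of-envelope math (the 1.5 Mbit/s DSL figure is just an assumed visitor connection speed, not a measurement):

```python
# Rough page-weight estimate: 60 thumbnails at ~40 KB each,
# downloaded over an assumed ~1.5 Mbit/s DSL line.
num_images = 60
kb_per_image = 40
total_kb = num_images * kb_per_image          # 2400 KB of thumbnails alone
total_megabits = total_kb * 8 / 1000          # ~19.2 Mbit on the wire
seconds_at_1_5_mbps = total_megabits / 1.5    # ~12.8 s, ignoring latency

print(f"{total_kb} KB total, ~{seconds_at_1_5_mbps:.1f} s at 1.5 Mbit/s")
```

So that’s roughly 2.4 MB of thumbnails before any HTML or CSS, which would take over ten seconds on a typical DSL line.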

The page is:

thanks for any help!


It loads fine on my machine right now, though it does take 12s to fully load. I think all modern browsers do incremental rendering, so I can start reading and the page finishes loading before I get to the bottom.

I recommend using Firebug and YSlow. YSlow had a couple of recommendations for your page, but you were correct in assessing that the sheer number of images makes up the bulk of your page’s load time.

At this point, I think it’s mostly a trade-off in user perception: do you want a complete single page showing all your work, or do you want to give your prospects the impression that “hey, this website is fast”? A lot of first-time visitors won’t click through to page 2, so it’s a serious decision.



Thanks, I will try those! Though I think the clever people at Dreamhost support have identified the issue I was seeing. They took a look at the logs and guessed from my access patterns that I was running the FasterFox extension, which by default (in “TurboCharged” mode) exceeds the RFC’s recommendations and opens 50-100 connections, putting a big load on web servers. I turned it down, and things look better so far.

Thanks dreamhost!
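The “polite client” idea boils down to capping how many requests you have in flight at once instead of opening 50-100 sockets. A minimal sketch using a semaphore (the cap of 4 and the simulated downloads are illustrative assumptions, not from any spec or from FasterFox itself):

```python
# Sketch: cap concurrent requests to a server with a semaphore,
# rather than opening one socket per image all at once.
import threading
import time

MAX_CONNECTIONS = 4                      # polite cap, vs. FasterFox's 50-100
slots = threading.BoundedSemaphore(MAX_CONNECTIONS)
peak = 0
active = 0
lock = threading.Lock()

def fetch(url):
    """Pretend to download one image, holding a connection slot."""
    global peak, active
    with slots:                          # blocks until a slot is free
        with lock:
            active += 1
            peak = max(peak, active)
        time.sleep(0.01)                 # stand-in for the actual download
        with lock:
            active -= 1

threads = [threading.Thread(target=fetch, args=(f"img{i}.jpg",))
           for i in range(60)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"peak concurrent requests: {peak}")  # never exceeds MAX_CONNECTIONS
```

With the cap, 60 downloads queue up behind 4 slots instead of all hitting the server simultaneously.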


Yep! That puppy can really hammer a server - imagine if several people are using it at the same time on a shared server. It’s almost a DoS attack! :wink:

Probably as good a time as any for a little "webmaster defense" training from the Fasterfox FAQ:

[quote]Because some websites may not have the resources available to support the enhanced prefetching feature, it may be easily blocked by webmasters.

Prior to generating any prefetching requests, Fasterfox checks for a file named “robots.txt” in your site’s root directory (subdirectories are not checked). If this file contains the following 2 lines, no prefetching requests will be made to your domain:

User-agent: Fasterfox
Disallow: /[/quote]
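Putting the FAQ’s advice together, a complete robots.txt at the site root might look like this (the wildcard section is just a typical permissive default I’ve added for illustration; the Fasterfox block is verbatim from the FAQ above):

```
# Block Fasterfox prefetching (per the Fasterfox FAQ)
User-agent: Fasterfox
Disallow: /

# Leave normal crawlers unaffected
User-agent: *
Disallow:
```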