Permission troubles on main directory


#1

Whenever someone visits my website with IE using “http://www.mydomain.com”, all of my main page’s images are broken. However, if they visit using “www.mydomain.com” or “mydomain.com”, the main page loads correctly. On Firefox it loads correctly in every case; the problem only occurs with IE. I don’t have any fancy code running, my main page is just images and HTML/CSS files. The only thing I can think of that might be causing this is that I’ve enabled .htaccess link protection in the DreamHost control panel on most of my subdirectories. What boggles my mind is that I have not enabled .htaccess on the main directory, and the page loads fine under the other conditions. According to the error portion of my stats, about 20-30% of my site’s visitors are affected by this problem.


#2

Try adding your own domain to the list of allowed sites. I had that problem with a different provider once, and that fixed it.

Worth a shot :slight_smile:


#3

Otherwise, tell us your domain name and let us try with our browsers. Maybe we’ll think of something else, too.

TorbenGB


#4

It doesn’t seem like I’m able to enter my main directory as an allowed site (besides what it puts in by default), since the .htaccess options in the control panel point out “no www necessary”. But I don’t see how that could help anyway, considering that my subdirectories have nothing to do with my main directory.

My website’s address is FantasyAnime.com.


#5

Hmmm… nice site! Nothing broken when I visited with either IE6 or Firefox. Then again, maybe you already fixed whatever problem was lurking.

~Chell


#6

Actually, I didn’t do anything. There’s another important detail I forgot to mention: this problem doesn’t affect every WinXP computer. On the computer I’m using now, my website loads properly in IE with the “http://www”. When I test my website on my father’s WinXP computer, I experience the problem. This problem only came to my attention because several of my site’s visitors told me about it. I can only conclude that it is caused by a certain condition relating to WinXP. I thought maybe it was because I installed SP2, but after researching I found that computers with both SP1 and SP2 encountered this problem. I tried my website on Mac OS X as well; it loads properly under any condition in Firefox, IE for Mac, and Safari.

It’s a very unusual problem. The best conclusion I can reach is that DreamHost’s .htaccess setup somehow conflicts with an IE bug that only surfaces when certain conditions are met. As mentioned previously, about 20-30% of my site’s visitors experience this problem, so it’s something I’m concerned about and hope can be resolved. I arrived at that percentage estimate from the considerably high number of failed requests for my main page’s images.


#7

I also don’t see anything wrong, regardless of whether I use Firefox or IE. So it seems it’s a setting on your local PCs rather than the website itself?

TorbenGB


#8

I’ve tried checking IE’s privacy settings, but the different machines all seem to be on the same default settings. Any ideas? I never experienced a problem like this with my previous host.


#9

I experienced something similar when trying to restrict access to PDF files on a client’s site using .htaccess. It was supposed to redirect the visitor to the PDF listing page if they came to a PDF directly via a search engine or other external link. This worked for 90% of the visitors, but the other 10% simply couldn’t access the files at all.

I disabled the .htaccess and never figured out what was causing this or how to solve the problem.

jason


#10

If it’s a new domain, it probably hasn’t propagated properly yet. Give it at least a day.

GottaDeal.com Deals & Coupon Codes. Why Pay Retail?


#11

The whole process of preventing hotlinking relies on whether or not the browser sends a referrer header to the web server. If the browser does not send one, or a firewall or proxy strips it, the hotlink prevention code naturally will not work as expected. For example, it is possible to configure Mozilla browsers not to send a referrer header, and firewalls or proxies might offer “privacy” features that intercept web traffic to monitor and possibly modify it.

Before asking for help, you should either inspect the web traffic yourself or check your Apache log files. For example, when using this directive,

RewriteRule \.(gif|jpe?g|png)$ - [F]

one should see the following in the access.log file:

"GET /images/01.jpg HTTP/1.1" 403 230 "BLOCKED_URI" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.5) Gecko/20041107 Firefox/1.0"Where BLOCKED_URI is a URI that your RewriteCond directives match against.

If, however, in the access.log you see:

"GET /images/01.jpg HTTP/1.1" 200 43469 "-" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.5) Gecko/20041107 Firefox/1.0"

Well, then the visitor did not send a referrer and was still able to download the image. So if your RewriteCond directives allow blank referrers, this is working as expected. However, if your RewriteCond does not allow blank referrers, then you should instead see something like:

"GET /images/01.jpg HTTP/1.1" 403 230 "-" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.5) Gecko/20041107 Firefox/1.0"
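To make the two cases concrete, here is a rough sketch of the relevant directives in each variant (example.com is only a placeholder, and the file DreamHost actually generates for you may differ slightly):

[code]
RewriteEngine on

# Variant A: blank referrers are let through.
# The !^$ condition makes the rule skip any request with no Referer header at all.
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/?.*$ [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F]

# Variant B: blank referrers are forbidden as well.
# Omit the !^$ condition, so a missing Referer also fails the remaining test.
#RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/?.*$ [NC]
#RewriteRule \.(gif|jpe?g|png)$ - [F]
[/code]

In variant A an empty referrer fails the !^$ condition, so the RewriteRule never fires and the image is served; in variant B an empty referrer does not match the allowed-site pattern, so the request gets a 403.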

And to avoid issues caused by a browser caching either the resource or the response, use something like wget or LWP::UserAgent to confirm that the web site is processing requests as expected. There is also http://web-sniffer.net/, however it always sends a referrer and you cannot change its value.

:cool: Perl / MySQL / HTML CSS


#12

I recall getting “on the trail” of the proxy possibility, but due to limited technical experience I gave up once the client said “open them up”, thus not paying me to look into it :slight_smile:

Your reply is followable and I see the sense of it. Would you mind giving an example of the proper way to handle blank referrers?

While I was able to understand most of your post, the last paragraph is mostly a foreign language to me, so feel free to dumb it down if ye be so inclined :slight_smile:

thanks man!

jason


#13

An HTTP server is a web site.
So something that connects to a web site is an HTTP client.
When a client connects to a server, it sends an HTTP request.
The server should respond with an HTTP response.

Web browsers are clients, so are search engine robots, comment spammers, copyright infringement spybots, RSS aggregators, CDDB-enabled applications, the ‘wget’ program, and of course programming languages allow programmers to develop clients of their own.

The ‘wget’ program is a client that can be run by a shell user or from a CGI script. LWP::UserAgent is a Perl library that can be used to build clients in Perl.
http://web-sniffer.net/ is a web site that acts like a client: You specify a URI to request, and it will show you the response.

When a client sends its request, it has the option of letting the server know the URI of the page that generated the request. In other words, “Hi Bob, Larry told me I could get this page from you,” or “Hi Bob, you gave me this page, but I need the image that goes along with it.”

So what happens if a person types a URI into their browser, or has it set as a bookmark? There is no referrer. Also, a client can obviously be developed with the intent of not sending referrers, or it can allow the user to configure it not to send a referrer, or even to fake the referrer.

So the hotlink prevention code depends on this referrer, which might not be there, or might be faked.

Anyways, when troubleshooting hotlink prevention, you will have trouble if you use a browser. Browsers have two types of cache: memory and disk. So what happens if you go to a web page that is allowed to link to the images? The images load, and they are stored in the memory or disk cache if possible. If you then go to a page that is not allowed, or perhaps the first page is no longer allowed to link to the images, you have a problem: the browser might pull the images from its cache and never actually ask for them a second time. This means that even if the hotlink prevention code is correct, and the browser is sending the referrer, it is still possible for a “blocked” page to show the images. The reverse may also happen: an “allowed” page might not show the images if a “blocked” page was viewed previously.

So the proper way to troubleshoot whether or not the hotlink prevention code is working is to use a different client, where you can be sure the responses aren’t being cached and you can specify any referrer you want. Think of it as being thorough.

For example, if the DreamHost web panel spits out an .htaccess file like so:

RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example.com/?.*$ [NC]
RewriteRule \.(gif|jpg|jpeg|png|mp3|mpg|avi|mov|swf)$ - [F]

Edit: This means “forbid access to URLs that end in these extensions if the referrer is not blank and the referrer is not example.com or www.example.com”.

and you have a GIF file in the same directory as the .htaccess,

Then log into shell and run the following commands:

  1. wget --header='Referer: http://example.com/' http://example.com/filename.gif

  2. wget --header='Referer: http://www.example.com/' http://example.com/filename.gif

  3. wget --header='Referer: http://not.example.com/' http://example.com/filename.gif

  4. wget --header='Referer: http://www.dreamhost.com/' http://example.com/filename.gif

  5. wget http://example.com/filename.gif (no referer is sent)

And these are the results you should get:

  1. This should show ‘200 OK’, since example.com does not match the last regexp.

  2. This should show ‘200 OK’, since www.example.com does not match the last regexp.

  3. This should show ‘403 Forbidden’, since not.example.com matches both regexps.

  4. This should show ‘403 Forbidden’, since www.dreamhost.com matches both regexps.

  5. This should show ‘200 OK’, since (no referer) does not match the first regexp.
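If you want to double-check on the server side as well, you can peek at the access log after running the tests above. This is only a rough sketch: the path below is a guess at the usual DreamHost layout (~/logs/&lt;domain&gt;/http/access.log) and may differ on your setup.

[code]
# show the last few requests logged for the site; they should report the same
# 200 / 403 responses that the wget tests above printed
tail -n 10 ~/logs/example.com/http/access.log
[/code]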

And for those who post to the forum asking for help: we’re not psychic. Please post the URI of an image, in addition to the contents of the .htaccess file, so that the gurus can help.

:cool: Perl / MySQL / HTML CSS


#14

Twista: I’ve had my domain since 1999, so it’s not new. I’ve been with DreamHost for several months now.

Atropos7: Your tech talk is going way over my head. x_x Can you explain in simpler terms a way I could resolve this? That is, if there is a way. I wish I could upload my own .htaccess files, but they’re not visible in the directory after the FTP upload. I assume DreamHost has them blocked.

As a side note, I just want to point out again that my problem is occurring under very ordinary circumstances. I mean, all I’ve done is enable .htaccess hotlink protection for my subdirectories in the DreamHost control panel. Considering that, I’m sure other DreamHost users are affected by this problem.


#15

They are not being blocked. It is an OS convention that filenames beginning with a period are “hidden,” since they are most often configuration files for programs and not simply data files.

https://panel.dreamhost.com/kbase/index.cgi?area=2932

In the shell, one uses the ls command to get a directory listing. To get the ls command to show “hidden” files, you need to type “ls -a” (type “ls --help” to see why).
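Just to make that concrete, here is roughly what the difference looks like (the filenames other than .htaccess are placeholders):

[code]
$ ls
images  index.html
$ ls -a
.  ..  .htaccess  images  index.html
[/code]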

As far as the problem with your site goes, there doesn’t seem to be any. It looks exactly the same in both Firefox and IE 6, with no missing images. Your original statement, though, does not make sense:

When you type a URI into the browser location bar, it is up to the browser to decode it properly. If you truly meant to say “http://www.fantasyanime.com” gives different results than typing just “www.fantasyanime.com”, then you have a problem with your browser, because as far as the web site is concerned, they are exactly the same. The browser should do this for both:

  1. Extract hostname ‘www.fantasyanime.com’ and get its IP address.
  2. Connect to IP address on port 80
  3. Request ‘/’ with the Host header set to ‘www.fantasyanime.com’

And that’s it. So you can see the web site is never going to know whether the person left off the ‘http://’ or not. However, the browser could be buggy and, instead of sending a proper URI for the referrer, set it to what was typed, or otherwise get it wrong. I.e.,

IE is now downloading http://www.fantasyanime.com/fa_bnr_anime.gif after performing the above steps for a person who typed just ‘www.fantasyanime.com’:

  1. Extract hostname ‘www.fantasyanime.com’ and get its IP address
  2. Connect to IP address on port 80.
  3. Send a request for ‘/fa_bnr_anime.gif’ with the Host header set to ‘www.fantasyanime.com’ and the Referer header set to ‘www.fantasyanime.com’

Oops…the hotlink prevention code sees that the ‘www.fantasyanime.com’ Referer header does not match ‘http://www.fantasyanime.com/’ or ‘http://fantasyanime.com/’ and tells IE it can’t have the image. Bad IE, bad. Though no recent version of IE should be doing that…

:cool: Perl / MySQL / HTML CSS


#16

I doubt it, especially if your computers experience the problem but ours don’t. Unless the problem got fixed between the time you posted here and when we posted our responses.

But I’ll try to duplicate what you might have done. Here goes.

main domain: overground.openvein.org
subdirectory: images

  1. Went to the Htaccess/WebDAV panel and selected ‘www.overground.openvein.org’.
  2. Specified the directory “images”.
  3. Selected “Forbid linking to files in this dir”.
  4. Extensions: gif jpe?g png
  5. Did not specify other hostnames allowed for linking.

An .htaccess file was placed in that directory.

RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?overground.openvein.org/?.*$ [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F]

Visit http://overground.openvein.org/

The first “Atropos” image is in the images directory. The second is in the “main” directory.
I do not have an .htaccess file in the “main” directory.

When pasted into the IE 6 location bar, these all show same results:
overground.openvein.org
www.overground.openvein.org
http://overground.openvein.org
http://www.overground.openvein.org

Both images show up. I added a link to one of your images for effect. It should not show up.

I logged into a shell user, and this is my session:

[code]user@machine:~/test$ wget http://www.fantasyanime.com/finalfantasy/ff1old/fflogo.gif
--01:17:34-- http://www.fantasyanime.com/finalfantasy/ff1old/fflogo.gif
           => `fflogo.gif.3'
Resolving www.fantasyanime.com... done.
Connecting to www.fantasyanime.com[205.196.216.43]:80... connected.
HTTP request sent, awaiting response... 200 OK

user@machine:~/test$ wget --header='Referer: http://www.fantasyanime.com/' http://www.fantasyanime.com/finalfantasy/ff1old/fflogo.gif
--01:11:39-- http://www.fantasyanime.com/finalfantasy/ff1old/fflogo.gif
           => `fflogo.gif.1'
Resolving www.fantasyanime.com... done.
Connecting to www.fantasyanime.com[205.196.216.43]:80... connected.
HTTP request sent, awaiting response... 200 OK
....
user@machine:~/test$ wget --header='Referer: http://fantasyanime.com/' http://www.fantasyanime.com/finalfantasy/ff1old/fflogo.gif
--01:11:54-- http://www.fantasyanime.com/finalfantasy/ff1old/fflogo.gif
           => `fflogo.gif.2'
Resolving www.fantasyanime.com... done.
Connecting to www.fantasyanime.com[205.196.216.43]:80... connected.
HTTP request sent, awaiting response... 200 OK

user@machine:~/test$ wget --header='Referer: http://overground.openvein.org/' http://www.fantasyanime.com/finalfantasy/ff1old/fflogo.gif
--01:12:14-- http://www.fantasyanime.com/finalfantasy/ff1old/fflogo.gif
           => `fflogo.gif.3'
Resolving www.fantasyanime.com... done.
Connecting to www.fantasyanime.com[205.196.216.43]:80... connected.
HTTP request sent, awaiting response... 403 Forbidden
01:12:14 ERROR 403: Forbidden.
.....
user@machine:~/test$ wget http://www.fantasyanime.com/fa_bnr_rpgs.gif
--01:23:03-- http://www.fantasyanime.com/fa_bnr_rpgs.gif
           => `fa_bnr_rpgs.gif'
Resolving www.fantasyanime.com... done.
Connecting to www.fantasyanime.com[205.196.216.43]:80... connected.
HTTP request sent, awaiting response... 200 OK

user@machine:~/test$ wget --header='Referer: http://www.fantasyanime.com/' http://www.fantasyanime.com/fa_bnr_rpgs.gif
--01:24:04-- http://www.fantasyanime.com/fa_bnr_rpgs.gif
           => `fa_bnr_rpgs.gif.1'
Resolving www.fantasyanime.com... done.
Connecting to www.fantasyanime.com[205.196.216.43]:80... connected.
HTTP request sent, awaiting response... 200 OK
....
user@machine:~/test$ wget --header='Referer: www.fantasyanime.com' http://www.fantasyanime.com/fa_bnr_rpgs.gif
--01:26:16-- http://www.fantasyanime.com/fa_bnr_rpgs.gif
           => `fa_bnr_rpgs.gif.2'
Resolving www.fantasyanime.com... done.
Connecting to www.fantasyanime.com[205.196.216.43]:80... connected.
HTTP request sent, awaiting response... 403 Forbidden
01:26:16 ERROR 403: Forbidden.
[/code]
Yup, the hotlink prevention code is working just fine. If you check your logs and see that your IE machines are sending referrers without the ‘http://’ and/or trailing slash, you’d have to modify the hotlink prevention code to make up for such a buggy browser.
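For example, a relaxed condition along these lines would also accept referrers that arrive without the scheme or the trailing slash. This is only a sketch using your domain, not the exact file the panel generates; adjust it to match your real .htaccess and test it with wget as shown above before relying on it:

[code]
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
# accept referrers with or without "http://", "www." and a trailing path
RewriteCond %{HTTP_REFERER} !^(https?://)?(www\.)?fantasyanime\.com(/.*)?$ [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F]
[/code]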
:cool: Perl / MySQL / HTML CSS


#17

Atropos7,

I followed, in theory, most of what you discussed above, but was lost by the more technical bits.

But I have to say, it is very kind and generous of you to take so much time (and effort) to help find an explanation for what is happening… or at least to better describe what is happening.

It’s not every day that someone makes such an effort to help out a fellow DH denizen… and is patient while doing so.

Thanks for your contributions to this forum.

Cheers,

Jem


#18

Thanks, Atropos7. Your technical explanation still went over my head, but pointing out how to make .htaccess files visible on the hosting-account side was extremely helpful. I’ll fiddle around with various URL rewrite codes until I find one that doesn’t cause this unusual problem.


#19

eek

I am printing your posts for reading in mah easy chair tonight, thanks so much man!

jason


#20

Good luck with that… Since you don’t see the problem with Firefox, I doubt the “problem” has anything to do with the hotlink prevention code, and in that case you’d be better off analyzing the log files for your site to see exactly what your IE browser is doing. The hotlink prevention code depends on the browser to do its job right, after all: is it sending a proper referrer, and is it caching the responses to its requests?
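One rough way to do that from the shell (the log path below is just a guess at the usual DreamHost layout and may differ on your setup):

[code]
# list recent image requests that were refused, including the referrer and
# user-agent fields, so you can see what the affected IE machines are sending
grep ' 403 ' ~/logs/fantasyanime.com/http/access.log | grep '\.gif' | grep 'MSIE' | tail -n 20
[/code]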

:cool: Perl / MySQL / HTML+CSS