Dreamhost's policy on remote file access

software development

#1

Can anyone tell me whether the new policy to disallow accessing remote files through PHP will be a blanket policy that halts the use of RSS feeds?


#2

Shouldn’t do that at all, no.

I’ve never seen an RSS reader that used that function.

jason


#3

[quote]Shouldn’t do that at all, no.

I’ve never seen an RSS reader that used that function.

[/quote]

Nope, definitely not - the policy change is for that very specific function only, not for the generic act of “grabbing the content of a remote file and doing something with it”. RSS, Atom, etc. tools are fine, policy-wise.

Basically, we’re just trying to thwart a recent onslaught of security problems.

  • Jeff @ DreamHost
  • DH Discussion Forum Admin

#4

I use PHP to include some RSS feeds. Doing that, as far as I know, requires you to read the external file in some manner, such as with fopen(), file(), or the cURL functions. According to the PHP docs, fopen() and file() require the fopen URL wrappers (allow_url_fopen) to be enabled for remote files; cURL does not.

Indeed, my testing with my local copy of PHP confirms that. If I disable allow_url_fopen on my local copy, I can no longer access the remote RSS feeds. That said, the Dreamhost change does not, in fact, appear to have caused any problems reading external files.
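For illustration, a minimal test of that behavior might look like this (the feed URL is just a placeholder, not one from this thread):

<?php
// fopen-wrapper reads of a remote URL only work while allow_url_fopen is on.
// (The feed URL is a placeholder.)
if (!ini_get('allow_url_fopen')) {
    die("allow_url_fopen is disabled; fopen-style remote reads will fail.\n");
}
$xml = file_get_contents('http://example.com/feed.rss');
if ($xml === false) {
    die("Could not read the remote feed.\n");
}
echo strlen($xml) . " bytes read.\n";
?>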


#5

I too need an alternative. My news page on my main website (http://www.infinity-zero.net) pulls in a script from my forums at forums.infinity-zero.net:

<?php include("http://forums.infinity-zero.net/ssi.php?a=news&show=7"); ?>

While I might be able to move that script from my forums account to my default domain account, I have a second script which takes RSS feeds from my blog hosted by Google and then makes an image out of it.

What can I use to get the RSS feed?

Can I compile my own instance of PHP, run it as CGI, and then enable allow_url_fopen?


#6

Nope. I just checked. It’s not possible for me to move the ssi.php script, as it integrates with the rest of the IPB forums. What can I do?


#7

Haven’t looked at their source code yet, but MagpieRSS has this on their page: “Does not use fopen(), works even if allow_url_fopen is disabled.”

So maybe you could use that script, or copy whichever method they’re using to retrieve the RSS feed.


Pointy-Ears.net


#8

Just minutes ago, the change was apparently finally implemented. All the external RSS feeds I was displaying died, and my currency conversion, which uses an external XML file, quit working as well. This is totally unacceptable. If there is not some other way to grab the content of an external file, I’m afraid I’ll have to change hosts.


#9

I’m using MagpieRSS to syndicate my feeds, and I’m having no problems with them.


MacManX.com
I don’t work here. I’m just your typical support forum volunteer.


#10

[quote]This is totally unacceptable.

[/quote]

We actually publicly announced this change over a month ago, though for some inexplicable reason it did not take effect for all customers at the same time. While this does regrettably inconvenience some people, I’m not sure I’d go so far as to say that it’s unacceptable.

While we empathize with your situation, URL-based includes tended to entail a number of security- and performance-related issues that we had to head off. This is necessary in order to maintain the stability and integrity of our hosting servers and, by extension, your site(s).

As for workarounds, various people have been using cURL to pull content from outside sources. I imagine there are other implementations of this feature as well.
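A minimal sketch of that cURL pattern (the URL is a placeholder; this assumes the cURL extension is available):

<?php
// Fetch a remote file with the cURL extension, which does not depend on
// allow_url_fopen. (The URL is a placeholder.)
$ch = curl_init('http://example.com/feed.rss');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body as a string
curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // give up after 10 seconds
$contents = curl_exec($ch);
if ($contents === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);
?>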

  • Jeff @ DreamHost
  • DH Discussion Forum Admin

#11

Thanks for that. Yep. I’ve found that you can work around it using fsockopen instead of fopen or file_get_contents. It’s just a bit more work.
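For anyone trying this route, a rough fsockopen sketch (host and path are placeholders; fsockopen is unaffected by allow_url_fopen):

<?php
// Hand-rolled HTTP/1.0 GET over a raw socket.
$fp = fsockopen('example.com', 80, $errno, $errstr, 10);
if (!$fp) {
    die("Connection failed: $errstr ($errno)");
}
fwrite($fp, "GET /feed.rss HTTP/1.0\r\nHost: example.com\r\nConnection: close\r\n\r\n");
$response = '';
while (!feof($fp)) {
    $response .= fgets($fp, 4096);
}
fclose($fp);
// Discard the HTTP headers; keep just the body.
$parts = explode("\r\n\r\n", $response, 2);
echo isset($parts[1]) ? $parts[1] : '';
?>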


#12

I use custom call buttons for my service. Since you made your change, they no longer work on the MANY domains I host with you. Is there a workaround for this? The script calls up an image; here is a piece of the code. If not, I also will have to find other means of hosting :frowning:

Thanks

function imageOutput($filename) {
    // This function outputs an image when run.
    header("Content-type: image/gif");

    // Get the contents of the file into a string.
    $handle = fopen($filename, "r");
    $contents = fread($handle, filesize($filename));
    echo $contents;
    fclose($handle);
}
if ($good)
    imageOutput($filename);
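For what it’s worth, a hypothetical rewrite of that function using the cURL approach mentioned earlier in the thread (assuming the cURL extension is available and $filename is a remote URL):

<?php
// Hypothetical cURL-based rewrite of imageOutput() above.
function imageOutput($filename) {
    header("Content-type: image/gif");
    $ch = curl_init($filename);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture body as a string
    $contents = curl_exec($ch);
    curl_close($ch);
    if ($contents !== false) {
        echo $contents;
    }
}
?>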


#13

It is true that the announcement was over a month ago. I waited with “bated breath” until (and through) March 19th, and initiated a couple of threads on this board regarding the consequences for various 3rd-party scripts. The 19th of March came and went with no change on any of my domains…I even posted here asking why phpinfo showed allow_url_fopen still enabled and got no answer, other than one post which suggested it wasn’t relevant for PHP as CGI.

Tested all my sites, and all was well…no problem. Now, with three sites under development and no time reserved to troubleshoot this change, it occurs. That’s no better than no warning…actually worse, as time and effort were wasted in the days surrounding the 19th of March for no reason.

Guys, it is your host, and you should certainly do with it as you will…but common courtesy, not to mention a modicum of respect for your customers, requires that you do what you say you will do, when you say you will do it…If you are not going to do that, then don’t bother pretending that you are keeping your customers informed…just do as you please and let the chips fall where they may.

So far:

All Mambo CMS Newsfeeds are broken…

Still looking for additional collateral damage.

This is much worse than if it had happened on March 19th, when I had prepared myself and clients for potential problems…

Live and learn,

rlparker


#14

We are also seeing a lot of damage from the change, not just with the code I posted above. You know, a lot of us use newsfeeds and other types of PHP scripts, code, programs, etc., and we don’t ALL look at every tiny bit of code. So I had no idea this was going to cause us such a mess. But now I do have an idea of what a mess this is!

I have had around 100 or so people sign up under me for DreamHost accounts. With all the problems we have had in the last few months, databases down, servers down, and now this, I’m really at a loss. Some of them are having problems with this now and are pretty upset. But like you said, RL, it’s DreamHost’s servers and they can do as they wish. It’s just too bad that this change is causing such a major headache.


#15

I agree that this was handled badly. I got an announcement this morning indicating that we should be using cURL instead of the filesystem functions to get remote files. Had this been indicated back when the original announcement was made, instead of the post here telling us it wouldn’t make any difference, we would have had some time to prepare. It seems to me that I looked into cURL some time ago and it was not enabled on the DH servers, so I had discounted the possibility of using it now.

At any rate, cURL is now enabled and works just fine to replace any constructs where you may have been using fopen or file_get_contents to get a remote file. I had a little problem getting cURL enabled on my Windows development machine. For any others having the same problem, see http://zend.com/pecl/tutorials/curl.php for some tips on enabling it. The one thing that page does NOT mention is to copy the file libeay32.dll from the PHP dlls directory to your Windows system directory. Once that was done, everything worked correctly.
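Roughly, the php.ini side of enabling it looks like this (typical locations; the exact extension_dir depends on your install, so treat these paths as examples):

; php.ini on a Windows development machine
extension=php_curl.dll
; make sure extension_dir points at the folder holding the PHP extension DLLs, e.g.:
extension_dir = "C:\php\ext"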

Good luck to you all.


#16

This has caused a big problem on my site as well. I use Magpie on my site and it has been “incapacitated.” I’m basically a novice PHP user, but I have a feeling Magpie itself is “allow_url_fopen” immune; however, the “require_once” calls used to include it are not.

I have been forced to try and learn cURL in the few minutes I have between other important jobs and I am failing miserably.

I agree that there could have been a bit more information given to inexperienced users who may not have been aware that the changes would affect them. When I saw the first notice I had no idea what “allow_url_fopen” meant, so I assumed that it was an advanced feature that would be recognized by the people that used it. If the message had included some language indicating that “require” and “include” of remote URLs would be disabled, I would have had a much better chance of avoiding a sudden semi-catastrophic site failure. Dreamhost has always done right in the past, in my experience; however, I feel in this instance more could have been done to explain the change and the ramifications of disabling these common PHP functions.

I understand the need for security, and I understand why the change was needed. However, the “we told you so” attitude seems uncharacteristic of this company.

Finally, I know many of you have a firm grasp around the testes of php and curl. Perhaps someone could explain to me how to get cURL to retrieve a php script file and then run that script. All I get is the text of the document displayed on the html page. Three hours of Googling has been fruitless.

Thanks for your time.

-Jay


#17

If it were me (and it’s not)… I would curl it to get it, save it locally, and then use include or include_once on the local version… If appropriate, I might also check the age of the local version and only retrieve it (curl) once a minute, or 5, or 20.
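A rough sketch of that cache-and-include idea (the URL, cache path, and 5-minute window are arbitrary placeholders):

<?php
// Re-fetch the remote output at most once every 5 minutes; otherwise
// include the cached local copy. (URL and cache path are placeholders.)
$cache  = 'cache/news.html';
$maxAge = 300; // seconds

if (!file_exists($cache) || time() - filemtime($cache) > $maxAge) {
    $ch = curl_init('http://example.com/ssi.php?a=news&show=7');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $content = curl_exec($ch);
    curl_close($ch);
    if ($content !== false) {
        $fp = fopen($cache, 'w'); // local writes are unaffected by the change
        fwrite($fp, $content);
        fclose($fp);
    }
}
include($cache);
?>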

-Jason

I40.com - Home Page
MP3Mystic - Personal Streaming Music server.
(No longer hosted with Dreamhost)


#18

Jay-

I too am learning my way around this, so my answers should not be considered authoritative. Last night, however, I did manage to get my newsfeeds back using MagpieRSS. All my domains in question are running PHP as CGI (DH recommended, and now the DH default for new domains). User macmanx also reports using Magpie without any problems.

To see my quick-and-dirty workaround in action you can visit http://www.helium-labs.com/ and select the “Science News” link or, for another example (of the code I included below) visit http://www.cgiservices.net/sciencenewsx.php.

In the hope that this could be of help to you, I suggest you:

  1. Get the latest magpierss from sourceforge (magpierss-0.71.1.tar.gz as of last night)

  2. Install into a web-accessible directory as per the instructions included in the tarball. The only trick here (if you don’t want to edit the class file) is to store the 5 indicated files in a directory called magpierss.

  3. Edit/modify one (or more) of the example .php files included in the tarball to reflect your needs.

  4. Integrate those edited .php files into your site to display content as desired.

I did not find any problems with “requiring” the necessary files to operate Magpie.

I have included a short example .php file that worked for me, showing two different methods of calling the RSS feed with the Magpie rss_fetch function:

— begin code ------

<? require_once('magpierss/rss_fetch.inc');

$url = 'http://rss.news.yahoo.com/rss/science';
$rss = fetch_rss($url);

echo "Latest From: " . $rss->channel['title'] . "<br>";
echo "(Links will open in a new window)<br><br>";
echo "<ul>";
foreach ($rss->items as $item) {
    $href  = $item['link'];
    $title = $item['title'];
    echo "<li><a href=\"$href\" target=\"_blank\">$title</a></li>";
}
echo "</ul>";

$rss = fetch_rss('http://news.search.yahoo.com/news/rss?p=RFID+Privacy&ei=UTF-8&fl=0&x=wrt');
echo "Latest from: " . $rss->channel['title'] . "<br>";
echo "(Links will open in a new window)<br><br>";
echo "<ul>";
foreach ($rss->items as $item) {
    $href  = $item['link'];
    $title = $item['title'];
    echo "<li><a href=\"$href\" target=\"_blank\">$title</a></li>";
}
echo "</ul>";
?>

------ end sample code ---------

A file similar to this one (I used my Mambo stylesheet, and didn’t particularly code it pretty!), living in the directory above your “magpierss” folder installed per the instructions in the tarball, should work. It works well for me and can itself be included as needed in your code, or wrapped in an iframe (which is what I did for a quick-and-dirty implementation in Mambo to replace the now-broken Mambo newsfeed component).

To me, this seemed easier and quicker than grappling with cURL, though I do intend to attack that another day.

Regards,

rlparker


#19

I can do that, but it doesn’t make any sense to me. All the atom.xml files I want to parse, as well as all of the MagpieRSS files are already stored locally.

I already tried using include_once to retrieve them and it failed.

Do those files need to be stored somewhere specifically for them to be accessed “legally”?


#20

I have been using basically the same setup.

<?php
define('MAGPIE_DIR', 'http://www.gloryfish.org/blogs/magpie/');
require_once(MAGPIE_DIR . 'rss_fetch.inc');

$rss = fetch_rss('http://www.gloryfish.org/blogs/news/atom.xml');

// Push blog content onto array:
$item = $rss->items[0];

// Prepare strings from array:
foreach ($rss->items as $item) {
    $href    = $item['link'];
    $title   = $item['title'];
    $created = $item['created'];
    $content = $item['atom_content'];

    // Display content:
    echo "$title\n";
    echo "$content\n";
}
?>

The only difference I can see, essentially, is that I am using an absolute path to the rss_fetch.inc script, as opposed to a relative path. Would that have any effect?

Also, thank you all for responding. Makes me smile. :slight_smile:

-Jay

Update: Darn skippy that was it! Woo hoo!

I changed the paths to my locally stored content from absolute to relative and everything is fine. Thanks for the help rlparker!
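For anyone hitting the same problem, the fix amounts to something like this (the magpie directory name is assumed; the feed URL is Jay’s from above):

<?php
// Reference the local Magpie copy by a relative filesystem path rather
// than an http:// URL; require/include of local files is unaffected.
define('MAGPIE_DIR', 'magpie/');
require_once(MAGPIE_DIR . 'rss_fetch.inc');

// Magpie fetches feeds with its own socket-based HTTP client, so this
// still works with allow_url_fopen disabled.
$rss = fetch_rss('http://www.gloryfish.org/blogs/news/atom.xml');
?>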