
[RESOLVED] Is file_get_contents safe?

Is this code safe?

[code]
<?php echo file_get_contents($_REQUEST['url']); ?>
[/code]

To the best of my knowledge, file_get_contents acts as a third party when making the request, so there are no security worries. Is this correct?


@fci (Apr 10, 2006) — Will you be letting untrusted users use it? You could at least validate whether it is an actual URL; you may also want to try readfile:

http://us3.php.net/readfile
@bokeh (Apr 10, 2006) — [QUOTE]Is this code safe?

[code]
<?php echo file_get_contents($_REQUEST['url']); ?>
[/code]

To the best of my knowledge, file_get_contents acts as a third party when making the request, so there are no security worries. Is this correct?[/QUOTE]
This is 100% NOT safe. It allows anyone access to the content of every file PHP has access to, just by typing a filename into the address bar of their browser. For example: ?url=passwords.txt
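To make the risk concrete, here is a sketch of requests an attacker could send to that one-liner (the script name and filenames are illustrative):

[code=php]
// fetch.php contains: <?php echo file_get_contents($_REQUEST['url']); ?>
//
// Each of these requests leaks data the web server user can read:
//   fetch.php?url=passwords.txt   -- a file in the web root
//   fetch.php?url=/etc/passwd     -- an absolute server path
//   fetch.php?url=../config.php   -- source of a script outside the web root
[/code]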
@Ultimater (author, Apr 10, 2006) — Bokeh, are you telling me that file_get_contents wouldn't be denied permission if it were to request a private file? I tested it on a file in my _private directory and it was denied permission:

"HTTP request failed! HTTP/1.1 403 Forbidden"

Does this mean file_get_contents works like a third party during its request?
@fci (Apr 10, 2006) — It can read any files that it has permission to read, e.g., script.php?url=/etc/passwd

or even script.php?url=script.php to view the script's own source code.
@balloonbuffoon (Apr 10, 2006) — It's a good thing I read this, as it never occurred to me that users could access any file that PHP can. What would be the best way to determine if the input is a URL or a local file? Do you think this would be effective?
[code=php]$url = "passwords.txt";
$parts = parse_url($url);
if (isset($parts["scheme"])) {
    //good
} else {
    //bad
}[/code]


--Steve
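One caveat on the scheme test above: parse_url() reports a scheme for PHP's local stream wrappers too, so the presence of a scheme alone does not prove the target is remote. A small illustration (the URLs are hypothetical):

[code=php]
// Both of these pass the isset($parts["scheme"]) test above,
// yet both point at files on the local server:
var_dump(parse_url('file:///etc/passwd'));          // scheme => "file"
var_dump(parse_url('php://filter/resource=x.php')); // scheme => "php"

// A stricter variant whitelists the remote schemes explicitly:
$parts = parse_url($_REQUEST['url']);
$ok = isset($parts['scheme']) && in_array($parts['scheme'], array('http', 'https'));
[/code]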
@bokeh (Apr 10, 2006) — Yes. Even if the directory is protected by .htaccess, PHP can still read all the files in it if a server file path is used rather than an HTTP path.
@Ultimater (author, Apr 10, 2006) — Is there a better function than file_get_contents for this purpose, one that would be denied access to protected directories?

And would that function also run into the same problem?
@felgall (Apr 10, 2006) — What you need to do is filter the value being passed to that function so as to restrict it to just those files that you intend to allow access to.
@bokeh (Apr 10, 2006) — You need to use a switch statement to filter the files that are allowed. http://www.php.net/switch
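A minimal sketch of that switch approach, assuming the filenames are placeholders for pages you actually intend to expose:

[code=php]
// Hypothetical filenames -- replace with the pages you mean to allow.
switch ($_REQUEST['url']) {
    case 'page_one.html':
    case 'page_two.html':
        echo file_get_contents($_REQUEST['url']);
        break;
    default:
        // Default action for anything not on the list.
        echo file_get_contents('default_file.html');
}
[/code]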
@bokeh (Apr 10, 2006) — Or...[code=php]$allowed_urls = array('file_one', 'file_two', 'file_etc');
$default_file = 'default_file';
echo file_get_contents(in_array($_REQUEST['url'], $allowed_urls) ? $_REQUEST['url'] : $default_file);[/code]
@balloonbuffoon (Apr 10, 2006) — I think he wants to allow only web-accessible pages and not files on the server (and that's what I want to do, also). What would be the most effective way of determining whether a file is on the server or otherwise? Do you think the example I have above would work well, or could it be tricked?

--Steve
@Ultimater (author, Apr 10, 2006) — $_REQUEST['url'] could contain any of an unlimited number of permitted values,

e.g. http://msdn.microsoft.com/workshop/author/dhtml/overview/ccomment_ovw.asp

e.g. http://www.luxfx.com/chart/escapechart.html

and I wish to permit the user to request pages normally accessible to the browser on my own site. I'm only worried about security issues and the user accessing something on my server that shouldn't be read. Surely there is an easier way so that the user will only be able to pull up pages he could reach with a third-party HTTP request? That's all I'm trying to do: find a function that will treat the server it is running on as a third party.
@bokeh (Apr 10, 2006) — [QUOTE]I think he wants to allow only web-accessible pages and not files on the server[/QUOTE]Whatever you do, you need a default action if something is wrong. If you are just looking for an http path you could do this:[code=php]echo file_get_contents(preg_match('@^http://@i', $_REQUEST['url']) ? $_REQUEST['url'] : $default_file);[/code]
@NogDog (Apr 10, 2006) — You could make sure that the input filename begins with "http:" or "https:". Thus it will only read files which are served up by a webserver. Something like:
[code=php]
$url = $_REQUEST['url'];
if(preg_match('/^https?:/i', $url))
{
    if(readfile($url) === FALSE)
    {
        echo "<p class='error'>ERROR: Unable to read URL $url</p>\n";
    }
}
else
{
    echo "<p class='error'>ERROR: '$url' is not a valid URI.</p>\n";
}
[/code]
@Ultimater (author, Apr 11, 2006) — Isn't there some kind of taint mode in PHP that could be enabled?
@NogDog (Apr 11, 2006) — [QUOTE]Isn't there some kind of taint mode in PHP that could be enabled?[/QUOTE]
Doesn't look like it: http://www.php.net/~derick/meeting-notes.html#sand-boxing-or-taint-mode
@Ultimater (author, May 03, 2006) — [QUOTE]It's a good thing I read this, as it never occurred to me that users could access any file that PHP can. What would be the best way to determine if the input is a URL or a local file? Do you think this would be effective?
[code=php]$url = "passwords.txt";
$parts = parse_url($url);
if (isset($parts["scheme"])) {
    //good
} else {
    //bad
}[/code]


--Steve[/QUOTE]

Any answers to this question? My logic, at this point, is to allow all cross-domain HTTP requests that point to other servers, but to forbid access if the target is the localhost. (If for some reason I wanted to allow access to a page or two on the localhost, I could use a switch.)

I'm still curious how to detect if the file is on the localhost or not -- I've been developing plenty of Ajax applications and security is a very important factor.

Is there some sort of method that will return the domain of the file?

Maybe something along these lines would work?
[code=php]<?php
function http_get($url)
{
    $url_stuff = parse_url($url);
    $port = isset($url_stuff['port']) ? $url_stuff['port'] : 80;

    $fp = fsockopen($url_stuff['host'], $port);

    // Build a minimal HTTP/1.0 request (header lines end in \r\n)
    $query  = 'GET ' . $url_stuff['path'] . " HTTP/1.0\r\n";
    $query .= 'Host: ' . $url_stuff['host'] . "\r\n";
    $query .= "\r\n";

    fwrite($fp, $query);

    $buffer = '';
    while ($tmp = fread($fp, 1024))
    {
        $buffer .= $tmp;
    }
    fclose($fp);

    // Return just the body, using Content-Length to skip the headers
    preg_match('/Content-Length: ([0-9]+)/', $buffer, $parts);
    return substr($buffer, - $parts[1]);
}
?>[/code]

(revised from http://us2.php.net/manual/en/features.remote-files.php)

Edit:

Is [url=http://us2.php.net/manual/en/function.fileowner.php]fileowner[/url] or [url=http://us2.php.net/manual/en/function.stat.php]stat[/url] what I'm looking for?
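On the question of getting the domain: parse_url() returns the host component directly, which can be compared against the names and addresses your own server answers to. A sketch under that assumption (the $local list is illustrative and would have to cover every alias of your server; fileowner() and stat() won't help here, since they operate on local file paths rather than remote URLs):

[code=php]
$url   = $_REQUEST['url'];
$parts = parse_url($url);
$host  = isset($parts['host']) ? strtolower($parts['host']) : '';

// Names/addresses this server answers to -- illustrative, not exhaustive.
$local = array('localhost', '127.0.0.1',
               strtolower($_SERVER['SERVER_NAME']), $_SERVER['SERVER_ADDR']);

if ($host === '' || in_array($host, $local)) {
    die('Refusing to fetch a local resource.');
}
echo file_get_contents($url);
[/code]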
@bokeh (May 03, 2006) — [code=php]if(false !== (strpos(strtolower($url), 'localhost')))
{
# localhost
}
else
{
# not localhost
}[/code]
Well, that would test the URL, but my guess is there is something else wrong with your logic if you need to do this.
@balloonbuffoon (May 03, 2006) — Here's what I came up with:
[code=php]$url = "http://localhost/config.php";

if (is_url($url)) {
    echo "Valid URL, it is a remote file.";
} else {
    echo "Invalid URL! This is a local file!";
}

function is_url($url) {
    $parts = parse_url($url);
    if (@get_headers($url) && strtolower($parts["host"]) != "localhost") {
        return true;
    } else {
        return false;
    }
}[/code]
It should be pretty much entirely accurate as to whether the URL is a local file or a remote file.

--Steve
@bokeh (May 03, 2006) — [QUOTE]Here's what I came up with[/QUOTE]Was my solution too simple?
@balloonbuffoon (May 04, 2006) — In one word, yes. It doesn't prevent a user typing in "config.php" or the like as the url and getting that file relative to the working folder. You certainly can't assume the user will put in a URL. Plus, what if it was a valid URL, but one of the folders or the filename contained the string "localhost"? It would still be a perfectly valid URL, but would be rejected. You must pull out the hostname and check if [I]that[/I] is localhost; you can't just assume that any instance of "localhost" will be the hostname.

--Steve
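Concretely (the URL is hypothetical):

[code=php]
// An ordinary remote URL that merely contains the substring 'localhost':
$url = 'http://example.com/localhost/page.html';

// The substring test wrongly flags it as local:
var_dump(strpos(strtolower($url), 'localhost') !== false); // bool(true)

// parse_url() isolates the actual hostname instead:
$parts = parse_url($url);
echo $parts['host']; // example.com
[/code]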
@SpectreReturns (May 04, 2006) — The runkit extension provides a sandbox if you're interested in trying to install it.
@bokeh (May 04, 2006) — [code=php]echo file_get_contents($_REQUEST['url']);[/code][QUOTE]In one word, yes. It doesn't prevent a user typing in "config.php" or the like as the url and getting that file relative to the working folder. You certainly can't assume the user will put in a URL. Plus, what if it was a valid URL, but one of the folders or the filename contained the string "localhost"? It would still be a perfectly valid URL, but would be rejected. You must pull out the hostname and check if [I]that[/I] is localhost; you can't just assume that any instance of "localhost" will be the hostname.

--Steve[/QUOTE]
Well, personally, I think that is an overly intricate answer to deal with a line of code that should never be run in the first place.
@Ultimater (author, May 04, 2006) — I tried both your codes. Both worked when using the text "localhost" within the URL string. However, both failed when replacing "localhost" with my server's actual domain name. This is for a cross-domain Ajax requesting script that shows the source code of the user's desired URL in a textbox. I wish to develop the application further and help the user avoid deprecated tags and auto-correct them. Perhaps there is a safer way to do this? Users could also put in my server's IP address... Who knows how many ways the user can reference my server... I could set up a switch, but chances are I'm bound to overlook one of the ways.
@bokeh (May 04, 2006) — Ultimater, your problem is caused by your methodology. Instead of creating a blacklist you should be creating a whitelist. Don't check for what is not allowed; check for what is allowed.
@Ultimater (author, May 04, 2006) — I finally got file_get_contents to access a password file. So the URL must start with a "/" in order to read a file on the server.

[code=php]<?php
$url = "/var";
if (false !== (strpos(strtolower($url), 'localhost')))
{
    echo "localhost";
}
else
{
    echo "not localhost";
}
?>[/code]

echoes "not localhost"
@bokeh (May 04, 2006) — [QUOTE]echoes "not localhost"[/QUOTE]That's because /var is not localhost!

Ok, so you want to check that the $url variable:[list]
[*]contains a URL starting ftp://, http:// or https://
[*]is not http://localhost
[*]is not http://12.34.56.78
[/list]
Insert the following line before the call to file_get_contents() and it will abort the script if any of those conditions is not met:[code=php]preg_match('@^(f|ht)tps?://(?!(localhost|(\d{1,3}\.){3}\d{1,3})).+$@i', $url) or die();[/code]
@Ultimater (author, May 04, 2006) — Thanks bokeh, that works nicely!
@balloonbuffoon (May 04, 2006) — [QUOTE]Ultimater, your problem is caused by your methodology. Instead of creating a blacklist you should be creating a whitelist. Don't check for what is not allowed; check for what is allowed.[/QUOTE]It would be impossible to create a whitelist for the entire internet. I think the application (or at least one of them) that Ultimater is using it for is getting the source of any website with Ajax. Certainly a whitelist couldn't be used.

But anyway, I was just thinking about this, and it wouldn't matter if a person was trying to access a file on your server, as long as it's a URL. "http://localhost/passwords.php" will not list your passwords (or whatever PHP content is not echoed). All you need to do is check the URL with this:
[code=php]@get_headers($url)[/code]If it's true, it's a valid URL. If it's false, it might be a file on your server or may simply be invalid. This should really be the only check necessary, and it works the best, IMO. (Note: you must use the "@" to suppress errors when the URL is invalid.)

--Steve
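A sketch of that get_headers() check in context (hedged: get_headers() returns an array of response headers when the URL can be fetched over HTTP, or FALSE otherwise; the error message is illustrative):

[code=php]
$url = $_REQUEST['url'];

if (@get_headers($url) !== false) {
    // The value resolved and responded as a URL.
    echo file_get_contents($url);
} else {
    // A bare local path like "config.php" fails this test.
    echo 'Invalid URL or local file.';
}
[/code]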
@bokeh (May 04, 2006) — [QUOTE]"http://localhost/passwords.php" will not list your passwords (or whatever PHP content is not echoed). All you need to do is check the URL with this:[/QUOTE]Well, I wouldn't want anyone accessing localhost on my server. I have admin files and other things there. It's not just a case of whether someone can read the PHP code. Localhost should be a non-public area, full stop, and I would always work toward maintaining that. Also, the domain might have a much tighter open_basedir restriction in force than localhost.
@balloonbuffoon (May 04, 2006) — I don't know if I quite get what you're saying...

    "http://mydomain.com/passwords.php" and "http://localhost/passwords.php" are not treated the same way?

--Steve
@bokeh (May 04, 2006) — [QUOTE]I don't know if I quite get what you're saying...

"http://mydomain.com/passwords.php" and "http://localhost/passwords.php" are not treated the same way?

--Steve[/QUOTE]
If you believe a section of your site is private (e.g. localhost), security on that area may be more relaxed because it is a non-public area. Localhost runs on a different virtual host and may operate at a lower security level or with a less restrictive open_basedir restriction. For example, on my server I can access the majority of files on the server from localhost, including all my domains, whereas if the request is directed at one of the domains, opening of files is restricted to the directories that belong to that domain. I'm not saying PHP itself behaves differently on localhost, but it can be configured to act differently.
@balloonbuffoon (May 05, 2006) — Ok, I see.

--Steve