Setting apache2 environment variables on shared hosting

#1

Is it possible to set apache2 environment variables on shared hosting? How? On my local machine I just add lines to the file /etc/apache2/envvars:

export varname=thisvalue


#2

You can set them in .htaccess like this:

SetEnv VARNAME thisvalue

#3

This is why I hate working with the .htaccess file: every question spawns a game of twenty questions.

So it didn’t work:
.htaccess:
SetEnv TESTVAR justSomeText

testfile.php:
echo "[ ".getenv('TESTVAR')." ]";

output:
[ ]

If I define the variable inside envvars (all this testing is on my local machine), it does work.

So there is more magic involved here than meets the eye. Perhaps the DH settings are already set correctly for this, so I uploaded the same code to DH:

output:
[ ]

Same deal. According to Stack Overflow, the AllowOverride directive must be properly set, but that lives in httpd.conf. As a shared-host user, do I have access to that file?


#4

Setting up a local machine to mirror DH’s setup can be difficult – worthy of its own topic if you want to pursue it.

Here’s the working test I tried on DH (shared server) – the phpinfo() call is useful for examining all set variables. Access via example.com/test.php:

/home/user/example.com/.htaccess:

SetEnv TESTVAR justSomeText

/home/user/example.com/test.php:

<?php
echo getenv('TESTVAR');
phpinfo();
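
If you don't want to wade through the full phpinfo() page, a smaller dump narrows down where (if anywhere) the variable lands. Roughly speaking, under mod_php a SetEnv value shows up via getenv() and in $_SERVER, while under CGI/FastCGI it may appear with a REDIRECT_ prefix instead. A rough sketch (untested on DH):

<?php
// Quick check of the usual places a SetEnv value can land.
var_dump(getenv('TESTVAR'));                    // environment lookup
var_dump($_SERVER['TESTVAR'] ?? null);          // server/CGI variables
var_dump($_SERVER['REDIRECT_TESTVAR'] ?? null); // some CGI/FastCGI setups add a prefix
if (function_exists('apache_getenv')) {
    var_dump(apache_getenv('TESTVAR'));         // only available under mod_php
}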

#5

ok, before seeing your response, I created a new folder literally called “example.com” in my local www folder (always wanted to make one for testing purposes).

The ONLY text I put in the .htaccess file and the index.php file were those single lines. It worked.

SO, the reason it is not working on my main site (again, in my local directory only, one thing at a time) is likely an overriding setting somewhere. When I figure it out I’ll post back.


#6

OK, I figured it out. I had a fundamental misunderstanding of how Apache reads directories and .htaccess files.

I created the following directory structure:

example.com/
├── a
│ ├── .htaccess
│ └── index.php
├── .htaccess
└── index.php

a/.htaccess:
SetEnv TESTVARA examplea

a/index.php:
<h2><?= "example.com/a" ?>
<p><?= "TESTVARA=".getenv('TESTVARA');?>

.htaccess:
SetEnv TESTVAR example

index.php:
<h1><?= "example.com" ?>
<p><?= 'TESTVAR='.getenv('TESTVAR'); ?>


<?php include 'a/index.php'; ?>

output (when I visit http://127.0.0.1:8000/example.com)
example.com
TESTVAR=example
example.com/a
TESTVARA=

I thought (wrongly) that Apache reads a/.htaccess when PHP includes a/index.php. I need to rethink my entire approach to this problem. I wanted to keep the variables (the password for the SQL db) out of the top-level .htaccess on my site.
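
In other words (if I've got this right), Apache only applies the .htaccess files along the path of the URL that was actually requested; a PHP include happens inside that same request and never triggers another .htaccess lookup. Requesting the subdirectory directly should pick up a/.htaccess, so I'd expect:

output (when I visit http://127.0.0.1:8000/example.com/a/)
example.com/a
TESTVARA=examplea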


#7

One option I’ve seen mentioned is to store credentials in a php.ini file – on DH, that would be /home/user/.php/7.X/phprc.
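
If you go the phprc route, one pattern is to add a custom entry to that file and read it back with get_cfg_var(), which can return entries that aren't standard PHP settings. A sketch (I haven't tried it on DH, and the setting name is made up):

/home/user/.php/7.X/phprc:

mysite.db_password = "xyz"

any PHP script:

<?php
// get_cfg_var() reads values straight from php.ini/phprc, including custom entries.
echo get_cfg_var('mysite.db_password');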

Another common method is to store credentials in a PHP include outside of the web directory. Something like this (untested code):

/home/user/secrets.php:

<?php
$PASSWORD = 'xyz';

/home/user/example.com/test.php:

<?php
include dirname($_SERVER['DOCUMENT_ROOT']) . '/secrets.php';
echo $PASSWORD;

Disclaimer: IANASE (I am not a security expert!)


#8

I will put it in a folder in the root directory, inside a function (trying to avoid globals).
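
Something along these lines is what I have in mind (untested sketch; the file name and connection details are placeholders):

/home/user/secrets.php:

<?php
// Keep the password inside a function so it never sits in a global.
function dbPassword() {
  return 'xyz'; // placeholder
}

/home/user/example.com/connect.php:

<?php
include dirname($_SERVER['DOCUMENT_ROOT']) . '/secrets.php';
$con = new mysqli('localhost', 'dbuser', dbPassword(), 'thisDB'); // placeholder host/user/db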

The php.ini is also a possibility, but I can predict the future… next year I plan on upgrading to php 7.2 (or higher) and will wonder why everything suddenly stopped working. I GUARANTEE that would happen.

This plaintext password thing really bothers me, and the only security available is security by obscurity. Awful. Dan Boneh would have flunked me for doing this.

thanks for the help!


#9

For completeness of the topic, I’ll mention one last .htaccess option. On DH, you can have a home directory .htaccess file, which provides user-wide settings. It is sort-of analogous to global settings in /etc/apache2/ (envvars, *.conf, etc). For example:

/home/user/.htaccess:

SetEnv TESTVAR example

Now TESTVAR is accessible to all sites hosted under that user (e.g. /home/user/site1.com, /home/user/site2.com, etc). This might be too widespread for passwords, but it can be useful for setting user-based configuration flags, like distinguishing staging vs production users.
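
A hedged sketch of what that branching could look like in shared code (SITE_ENV is a made-up variable name; it would be set with SetEnv SITE_ENV staging in the staging account's home-directory .htaccess):

<?php
// Sketch: branch on a per-account flag set in the home-directory .htaccess.
// Assumes the staging account's .htaccess contains: SetEnv SITE_ENV staging
if (getenv('SITE_ENV') === 'staging') {
  ini_set('display_errors', '1'); // noisier on the staging account
} else {
  ini_set('display_errors', '0'); // quieter in production
}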


#10

That is a very good tip; I will incorporate that root .htaccess file.

I don’t mean to hijack my own thread, but this entire discussion spawned from a security concern. All of the sensitive information on my site is located in a folder called “.htData”; in theory, the .ht prefix prevents site visitors from entering the folder, regardless of the .htaccess settings.

However, if apache crashes/stops running, are all bets off? Is this still an issue? What would happen on DH if apache crashed? Is there a way for someone to tease out php/htaccess contents?

Moving that .htData folder up to the root level may be more secure, but it also mungs up the folder structure, and I'd have to rewrite the rsync scripts to account for it. If it’s worth it, I’ll do it.


#11

The denial of access to .ht-prefixed files is controlled by a Files directive in Apache’s default configuration file, but it can be overridden in an .htaccess file (intentionally or accidentally). The same holds for most settings, like the handlers for .php files, etc.
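
For reference, the stock rule looks something like this (taken from a default Apache configuration; DH's exact wording may differ, and note that the pattern is matched against file names):

<Files ".ht*">
    Require all denied
</Files>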

However, if apache crashes/stops running, are all bets off? Is this still an issue? What would happen on DH if apache crashed? Is there a way for someone to tease out php/htaccess contents?

A stopped or crashed Apache prevents all web access, so no one could retrieve/manipulate any content over HTTP/HTTPS. Of more concern is the accidental or malicious reconfiguration of Apache (or the server), which could cause a data breach.

If it’s worth it [keeping credentials out of web directories] I’ll do it.

Certainly it is worth doing for new sites – most web frameworks default to this arrangement. For an existing site, it is a judgement call because there is also risk in reworking a site. One could even accidentally make security worse (example: forgetting to remove debug code that echoes passwords!).

I would think keeping credentials out of the web directory can prevent a few types of breaches, but there are myriad other ways a site can be attacked and only finite time for defense. For example, I’d say having good incremental backups of the site/db is a higher priority than reworking credential storage.

Obligatory Tolkien quote for my wishy-washy answer: “Go not to the Elves for counsel for they will answer both no and yes.”


#12

I remember stumbling across the Files directive many months ago, which is what gave me the idea to prefix a folder name with ‘.ht’ to help prevent access. After your other comment about moving the sensitive stuff to the root directory I went ahead and did so.

The only framework I use is w3.css, nothing else (no Ruby/jQuery, WordPress, etc.). If there are any security holes, they are 100% my fault.

All of the coding for my site is done on my local machine, then rsync’ed to DreamHost as needed. Backups are never a problem since my bash scripts take care of them automatically (I won’t bore you with the details). I had to rewrite those scripts to reflect the new folder structure, but it was all tested locally, then on a beta site, then on the main site.

I use a simple function I wrote for echoing text to avoid disaster (but sometimes I forget).

	function echoIt($text = '<br>') {
	  global $user;
	  // Only echo for high-access (debug/admin) users.
	  if ($user->accesslevel < 9)
	    return;
	  echo $text;
	}

Passwords are no problem; they are unset immediately after password_verify():

	private function isValidPassword($email) {
	  global $con, $POSTdata;
	  $success = false;
	  $sql = 'SELECT pwhash FROM thisDB.thisTable WHERE email = ?';
	  if ($stmt = $con->prepare($sql)) {
	    $stmt->bind_param('s', $email);
	    $stmt->execute();
	    $stmt->bind_result($pwhash);
	    if ($stmt->fetch())
	      $success = password_verify($POSTdata['userpsw'], $pwhash);
	  }
	  unset($POSTdata['userpsw']); // do this asap.
	  return $success;
	}

Hopefully there are no security holes in that function…

Man, this thread has really gone off the rails, but that’s OK; I really like the subject.


#13

Looks good (but I’m no pentester). Another “fun” security audit is Mozilla’s Observatory – always very sobering if a site hasn’t had any hardening:

https://observatory.mozilla.org


#14

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.