Making Changes to httpd.conf on Shared Hosting

I’m working on getting a small site set up. One of the instructions states:

Edit Apache’s config to allow execution of cgi scripts within the directory:

<Directory "/path/to/directory/">
    Options ExecCGI
</Directory>
Now, I would assume that because I’m on shared hosting, accessing the httpd.conf file wouldn’t be allowed. So is there any way to make the required changes I need?
Okay, so apparently I can create an .htaccess file containing this:

Options +ExecCGI

And, assuming DH configured the httpd file to allow it, it’ll do what I want. I went ahead and created the file in the directory of the site I’m working on, but I still get a 403 error. So it appears that the .htaccess file isn’t doing its magic?
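For reference, the .htaccess I created looks roughly like this (the AddHandler line is my own addition based on the Apache docs, not part of the original instructions):

```apache
# .htaccess in the site directory. Only takes effect if the host's
# main config allows these overrides (AllowOverride Options FileInfo).
Options +ExecCGI
# My addition (not from the instructions): tells Apache which file
# extensions to treat as CGI scripts.
AddHandler cgi-script .cgi
```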

Make sure that the CGI script is set as executable, and that neither the script nor the directory containing it is writable by users other than yourself (i.e., directory permissions are set to 755, not 775 or 777).
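Something like this will set and verify those permissions (the paths here are a scratch demo, not your actual layout — substitute your own):

```shell
# Demo with placeholder paths; substitute your real site directory.
mkdir -p /tmp/cgi-demo
touch /tmp/cgi-demo/rip.cgi
# 755 = owner rwx, group/others r-x: executable, not writable by others.
chmod 755 /tmp/cgi-demo /tmp/cgi-demo/rip.cgi
# Verify the octal mode (GNU stat).
stat -c '%a' /tmp/cgi-demo/rip.cgi
```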

Additionally, make sure that the CGI script has an appropriate first line (e.g., “#!/usr/bin/perl”), that it has an appropriate file extension to be executed as a CGI (e.g., “.cgi”), and that it is encoded using UNIX line endings.
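A minimal script that satisfies all of those checks looks something like this (a generic sketch, not your actual script — it would be saved with a .cgi extension, 755 permissions, and Unix line endings):

```python
#!/usr/bin/env python3
# Generic CGI sketch. Apache relays whatever this writes to stdout
# back to the browser, so the output must begin with the headers
# followed by a blank line.

def respond():
    # Header block, blank line, then the body.
    return "Content-Type: text/plain\r\n\r\nCGI is executing\n"

if __name__ == "__main__":
    import sys
    sys.stdout.write(respond())
```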

If that doesn’t help, let me know what site you’re having issues with and I’ll take a look.

Confirmed that all the directories are set to 755, the cgi scripts have the appropriate extension, the first line is correct, and the encoding is correct. The scripts are executable, and I’m still getting the error that’s apparently caused by the httpd.conf issue.

The site is . The strange thing is, the Python scripts that download and compress the images are working just fine; when you click the “Rip and Zip” button, everything works as normal. When you click the “download .zip” button, it throws a 403 error. According to the installation instructions, this indicates that Apache isn’t letting the CGI scripts execute.

I’ve tried running the site on a local Apache server, and I’m not getting any issues when I configure my httpd.conf correctly. So I don’t think it’s an issue with the site?

Thanks very much for the prompt reply. Really appreciate it.

The information I’m seeing indicates that, as of the last 403 error you got, permissions were set incorrectly on the “rips” directory (error message was “directory is writable by others”). Permissions do look correct now, so I can only assume you changed it after that error message?

Didn’t mess with the site after the last error message. The “rips” directory is set to 755, though when I try to access the page, it gives me a 403. The page that’s being accessed is, for example, . Accessing it directly allows me to download the .zip archive containing the images. Accessing it via the “” button on the main page of the site, I get the 403?

It’s not a massive issue, I’ve realized, because the scripts are executing, the images are being compressed, and they’re being moved to the right directory, but it’d be nice if the download started when the button is pressed.
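If I ever get around to it, I imagine the fix looks something like this — a CGI that sends the archive with download headers so the browser prompts a save dialog. All names and paths below are made up for illustration, not my actual files:

```python
#!/usr/bin/env python3
# Hypothetical sketch: stream an existing archive so the browser
# starts a download. "rips/images.zip" is a placeholder path.
import os
import sys

ARCHIVE = "rips/images.zip"  # placeholder, not the real path

def download_headers(size, filename="images.zip"):
    # Content-Disposition: attachment is what triggers the download
    # prompt instead of displaying the response inline.
    return ("Content-Type: application/zip\r\n"
            f"Content-Length: {size}\r\n"
            f'Content-Disposition: attachment; filename="{filename}"\r\n'
            "\r\n")

if __name__ == "__main__" and os.path.exists(ARCHIVE):
    size = os.path.getsize(ARCHIVE)
    sys.stdout.write(download_headers(size))
    sys.stdout.flush()
    # Archive bytes go to the raw binary stdout buffer, after headers.
    with open(ARCHIVE, "rb") as fh:
        sys.stdout.buffer.write(fh.read())
```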