Need larger upload_max_filesize and a longer timeout


I am writing a script to allow members to import their products into our site via a tab-delimited .txt file. Unfortunately, the current limit for a file upload is 7MB.

We need a little more, around 20M, so people can add at least 10,000 items as a standard maximum, and more if we manually approve them. We probably also need to extend the timeout (I'm not sure of the variable name) so larger files have enough time to import/export without truncated or corrupted results. So all I need is to increase upload_max_filesize and extend the timeout for larger files (again, not sure which variable controls this).
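For what it's worth, these are the PHP directives usually involved; the "timeout" you're thinking of is max_execution_time (and max_input_time for receiving the upload itself). The values below are just illustrative, a sketch rather than recommended settings:

```ini
; Sketch of the relevant php.ini / phprc directives (example values only).
upload_max_filesize = 20M   ; max size of a single uploaded file
post_max_size = 24M         ; must be >= upload_max_filesize
max_execution_time = 300    ; script timeout, in seconds
max_input_time = 300        ; time allowed to receive/parse the upload
memory_limit = 128M         ; keep comfortably above the file size
```

Note that post_max_size caps the whole POST body, so it has to be at least as large as upload_max_filesize or the bigger setting never takes effect.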

Does anyone know of an easy way to do this, such as through .htaccess? I've read about it, but the docs say it's dangerous if you don't know what you're doing.

I just upgraded my domain to PHP 5.3 (FastCGI), so is the "easy" phprc file the only method that will work? I only need it modified for one directory within one website, not my entire account.

Does anyone know how to customize it for just one directory within one website? NOTE: I have a shared web account at Dreamhost, not a private server.
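I can't speak for Dreamhost's exact setup, but since you're on PHP 5.3 running as CGI/FastCGI, PHP can also read a per-directory `.user.ini` file, which would cover the "just one directory" case without touching the account-wide phprc. A sketch of what that file might contain (same caveat as above, example values only):

```ini
; .user.ini — dropped into the one directory whose scripts need the bump.
; Read by PHP 5.3+ under CGI/FastCGI for scripts executed in that directory.
upload_max_filesize = 20M
post_max_size = 24M
max_execution_time = 300
```

One caveat: `.user.ini` applies based on the directory of the script being executed, not the upload target directory, and PHP caches it for a few minutes by default (the user_ini.cache_ttl setting), so changes don't show up instantly.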


I seem to have gotten it working for the one website (just not for a specific upload directory). I have another odd problem, though.

After I upload a large file (an 8.5M text file), I run a loop to perform sanitization and analysis on the data, such as checking whether the user uploaded the required header columns (title, description, product_cost, etc.). My script is now throwing an error that the first header column is missing, which doesn't happen when I upload a smaller file with fewer items.
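Without seeing the script I can only guess, but two common causes of "first header column missing" are: the upload silently exceeding post_max_size (so $_FILES comes back empty), or a UTF-8 byte-order mark / \r\n line ending stuck to the first header name, which makes a plain string comparison fail. A hypothetical sketch of a header check that guards against the second cause (the function and variable names here are my own, not from your code):

```php
<?php
// Sketch: validate the header row of a tab-delimited upload.
// A BOM or a trailing \r glued to a column name makes exact matches fail,
// so strip both before comparing.

function parse_header_line($line) {
    // Remove a UTF-8 byte-order mark, if present, then trailing \r\n.
    $line = preg_replace('/^\xEF\xBB\xBF/', '', $line);
    return array_map('trim', explode("\t", rtrim($line, "\r\n")));
}

function missing_columns(array $header, array $required) {
    // Any required column names not found in the parsed header.
    return array_values(array_diff($required, $header));
}

$required = array('title', 'description', 'product_cost');
$header   = parse_header_line("\xEF\xBB\xBFtitle\tdescription\tproduct_cost\r\n");
$missing  = missing_columns($header, $required); // empty: BOM was stripped
```

It's also worth checking `$_FILES['yourfield']['error'] === UPLOAD_ERR_OK` before parsing at all, since an over-limit upload reports an error code rather than a usable file.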

Is there some side-effect and cure for this?


Never mind, I think I have much of it fixed. My remaining issue is importing 10,000 items into MySQL without timing out. It has something to do with making the while loop more efficient for each of the 10,000 MySQL inserts, if possible…
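One common way to speed up that loop, if it fits your setup, is to batch many rows into a single multi-row INSERT instead of issuing 10,000 separate statements, which cuts the round-trips to MySQL dramatically. A sketch (the table and column names are assumed from this thread, and the escaper is a stand-in; with a live connection you'd use mysql_real_escape_string):

```php
<?php
// Sketch: build one multi-row INSERT per chunk of rows instead of one
// INSERT per row. Table/column names are assumptions for illustration.

function build_batch_insert($table, array $columns, array $rows, $escape) {
    $tuples = array();
    foreach ($rows as $row) {
        $vals = array();
        foreach ($row as $v) {
            $vals[] = "'" . call_user_func($escape, $v) . "'";
        }
        $tuples[] = '(' . implode(',', $vals) . ')';
    }
    return 'INSERT INTO ' . $table
         . ' (' . implode(',', $columns) . ') VALUES '
         . implode(',', $tuples);
}

// Usage: chunk the parsed file into groups of a few hundred rows and run
// one query per chunk with whichever API you use (mysql_query/mysqli_query).
$rows = array(array('Widget', 'A widget', '9.99'),
              array('Gadget', 'A gadget', '19.99'));
$sql = build_batch_insert('products',
                          array('title', 'description', 'product_cost'),
                          $rows, 'addslashes'); // stand-in escaper for the sketch
```

Chunking (say, 500 rows per statement) keeps each query under MySQL's max_allowed_packet while still turning 10,000 inserts into a couple dozen queries.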

Thanks for looking

To load a large amount of data into MySQL, the most efficient way is to write all the records to a file and load the file into MySQL with LOAD DATA INFILE. For only 10,000 items, it should finish in a few seconds.
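That approach looks roughly like this, assuming a tab-delimited file and the column names mentioned earlier in the thread (the file path and table name are placeholders):

```sql
-- Sketch: load a tab-delimited file in one statement.
-- IGNORE 1 LINES skips the header row the members upload.
LOAD DATA LOCAL INFILE '/tmp/products.txt'
INTO TABLE products
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(title, description, product_cost);
```

One caveat on shared hosting: LOCAL INFILE can be disabled by the host, so it's worth testing before building around it; if it's blocked, the multi-row INSERT approach is the fallback.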