For the options, you’d best ask DH directly (e.g. how coupons interact with lifetime domains), and get their answer in writing so you have something to hold them to should it not work out the way you planned (it usually does, but it would be folly to trust a random user to know the system well enough to answer this).
I have no downtime percentage, since I’m not monitoring DH. Every time I logged in, it was up, and I have had no complaints. Maybe others will have links to their uptime trackers. From what I have seen so far, servers reboot roughly once every 1-3 months.
Concurrent connection limits : I believe there is a hard limit of 200 in place, plus some mod_security settings that limit you to 20 connections per IP. You cannot change these (and they are never shown). If you use these up with dynamic requests, it will not fail immediately, but if your processing takes too much CPU/memory, you will get a mail from DH asking you to reduce your usage or move to a higher-tiered service. As long as your usage does not threaten to crash the server or severely affect other customers on it, they will give you time to make adjustments (otherwise they disable your site by moving its directory).
Concurrent bandwidth pipe : 100 Mbit/s per shared server, at least on mine. There may be Gbit links on other servers, but I doubt it. (This is where the overselling nature of the service comes to light: 5 TB/month works out to just over 16 Mbit/s continuously, and with a 100% bandwidth coupon a single user would theoretically be allowed ~30 Mbit/s continuously — and we all know that traffic is not a flat curve.)
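To see where that 16 Mbit/s figure comes from, here is the arithmetic as a quick shell sketch (assuming 1 TB = 2^40 bytes and a 30-day month, which are my assumptions, not DH’s accounting rules):

```shell
#!/bin/sh
# Back-of-the-envelope: 5 TB/month expressed as a continuous bitrate.
BYTES=$((5 * 1024 * 1024 * 1024 * 1024))   # 5 TiB in bytes (assumed binary TB)
SECS=$((30 * 24 * 3600))                   # seconds in an assumed 30-day month
MBIT=$((BYTES * 8 / SECS / 1000000))       # bits per second, truncated to Mbit/s
echo "$MBIT Mbit/s"                        # prints: 16 Mbit/s
```

With decimal terabytes (10^12 bytes) the result drops to about 15 Mbit/s, so the exact figure depends on how DH counts a terabyte.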
Filesize limitations : the filesystem will handle anything you throw at it (at least until your storage plan is exceeded); a 500 GB file, for instance, is not a problem:
dd bs=1M seek=500000 if=/dev/zero count=1 of=test
1+0 records in
1+0 records out
1048576 bytes transferred in 0.175331 seconds (5980554 bytes/sec)
x@touareg:~/test$ ls -la
drwxrwxr-x 2 x pg1 3 2008-01-14 06:52 .
drwx-----x 21 x pg1 27 2008-01-14 06:52 ..
-rw-rw-r-- 1 x pg1 524289048576 2008-01-14 06:52 test
(anonymized) That is a sparse file of 500 GB. It works with anything you throw at it in the shell (e.g. scp works). However, this is just the filesystem side of things: Apache will not serve anything beyond 2 GB in size. I am not sure whether FTP will, as I have never tried.
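If you want to verify for yourself that such a file barely touches your quota, here is a smaller self-contained version of the same trick (filename and sizes are my own choices, nothing DH-specific): `du` reports the blocks actually allocated, while the byte count shows the apparent size.

```shell
#!/bin/sh
# Create a ~1 GB sparse file: seek past 1000 MiB, then write a single 1 MiB block.
dd bs=1M seek=1000 if=/dev/zero count=1 of=sparse_demo 2>/dev/null

APPARENT=$(wc -c < sparse_demo)          # apparent size in bytes (1001 MiB)
ALLOC_KB=$(du -k sparse_demo | cut -f1)  # disk blocks actually used, in KB

echo "apparent: $APPARENT bytes, allocated: $ALLOC_KB KB"
rm -f sparse_demo
```

The apparent size is 1001 MiB, but only about 1 MiB of disk is actually allocated (assuming a filesystem that supports sparse files, which ext3 on DH does).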
Running CGI scripts : yes.
Database and PHP lag depend on your configuration. You can reduce PHP lag considerably by setting up a PHP-FCGI installation instead of spawning a new PHP process on every request that needs it. Database lag, however, is either there or it isn’t, depending on whether you are grouped together with inconsiderate pricks on your database server. If you are, you can get support@ to look into it and either throttle the offending user(s) or move you to a different database server.
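For reference, a PHP-FCGI setup usually boils down to a small dispatch script that keeps PHP workers alive between requests. The paths and tuning values below are illustrative assumptions, not DH’s documented defaults, so check their wiki before copying anything:

```shell
#!/bin/sh
# php.fcgi — hypothetical FastCGI wrapper for your site (paths are assumptions).
# Each php-cgi process is reused for many requests instead of being spawned
# per request, which is where the latency win over plain CGI comes from.
export PHP_FCGI_CHILDREN=2        # keep two PHP workers alive (assumed value)
export PHP_FCGI_MAX_REQUESTS=500  # recycle a worker after 500 requests
exec /usr/bin/php-cgi             # adjust to wherever your php-cgi binary lives
```

You then point Apache at the wrapper via mod_fastcgi/mod_fcgid (typically an AddHandler line in your .htaccess).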
The mail server has responded speedily so far, though I have to admit I rarely use it.
FTP downtime : same as server downtime or HTTP downtime. I very strongly suggest you use SCP/SFTP instead of FTP.
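The SCP/SFTP equivalents of your usual FTP workflow are one-liners; the hostname, username, and paths below are placeholders, not real DH values:

```shell
# Copy a file up over SSH (encrypted, unlike plain FTP):
scp backup.tar.gz user@server.example.com:~/backups/

# Or start an interactive session with a command set much like classic ftp clients:
sftp user@server.example.com
```

For large or interrupted transfers, rsync over SSH is also worth knowing, since it can resume where scp cannot.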