Same problem as others: application killed

So, I have read about this problem in pretty much every other thread in this subforum on programming, and I have the same problem. I run a Python script (the language at least is different), which takes some data, manipulates it, and outputs a file. In theory. In practice, if the input file is too large it only outputs part of the result, because the process gets killed. And since we are speaking of an XML file, half a file is no good at all. Like nothing at all.

Is there any way around this? It is fairly important. I cannot run it on my own computer, as it is part of the service that I am trying to build on my website. I tried to ‘nice’ it, but maybe I wasn’t nice enough, as it got killed again… at exactly the same point. Thus I assume it must be a problem with the memory.

Is there anything else that can be done? Is Dreamhost open to being bribed: we pay more, and you give us more RAM? Or something similar. I considered breaking the whole program down into basic steps, all stored in the directory, and just picking up the data for one step, making one calculation, and storing the new data. But it is a massive amount of work, and in the end it would be way more disruptive for Dreamhost. Plus it would require some sort of cron job running every minute to check if there is any work that needs to be done.
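For what it’s worth, that “basic steps” idea could look something like the sketch below. This is purely hypothetical: the directory names and the `upper()` “calculation” are placeholders for whatever the real work would be, and a cron job would simply call `process_one()` every minute.

```python
import os

# Hypothetical sketch of the "basic steps" idea: each cron run picks up
# one pending item, does one small calculation, and stores the result,
# so no single process runs long or grows large in memory.
PENDING = "work/pending"
DONE = "work/done"

def process_one():
    os.makedirs(PENDING, exist_ok=True)
    os.makedirs(DONE, exist_ok=True)
    items = sorted(os.listdir(PENDING))
    if not items:
        return None                       # no work this minute
    name = items[0]
    with open(os.path.join(PENDING, name)) as f:
        data = f.read()
    result = data.upper()                 # placeholder for the real step
    with open(os.path.join(DONE, name), "w") as f:
        f.write(result)
    os.remove(os.path.join(PENDING, name))
    return name
```

Each invocation touches only one item, so memory use stays small, at the cost of a lot of filesystem bookkeeping.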

Anybody have other suggestions?
Dreamhost seems to offer such a good hosting service, but if it is not possible to scale up in this respect it is actually really limited.

Any suggestion would be appreciated.
Many thanks,

If you’ve niced it and you’ve trimmed back the memory usage as much as possible (and I must warn you that DH’s memory usage restriction is TIGHT, to ensure that everyone gets some), then you may be running into a limit that is incompatible with DH’s shared hosting.

I would suspect that you’ve already contacted DH support on this but if you haven’t you probably should. Maybe they’ll be kind enough or knowledgeable enough to give you a hint to get around the limitations. (I do suspect that you will be told that it’s beyond their ability to help.)

You might try asking the Python community if there is another way to perform the action you are trying to do that uses less memory or CPU. Surely others have dealt with similar problems.

Wholly - Use promo code WhollyMindless for full $97 credit. Let me know if you want something else!

Thanks for your answer. I made some extra tests and discovered that:
a) the output file gets produced up to exactly 64 KB.
b) the process runs to the end; it’s just that the file doesn’t come out in full.
c) once the 64 KB limit is reached, no other file can be written at all. It is as if the limit were: no process is allowed to write more than 64 KB of data across one or more files on the disk. I do remember there is a similar limit on how much data can be stored in cookies.

So it seems less like something to defend others against me, and more like something to defend me against malicious files installed in my directory by others. Especially since I have plenty of extra space.
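If anyone wants to check for a limit like this on their own account, a quick probe (a sketch in modern Python; the filename and sizes are arbitrary) is to write fixed-size chunks with an explicit flush and fsync, then see how big the file actually gets:

```python
import os

# Diagnostic sketch: write 1 KiB chunks, forcing each one to disk,
# to measure how much data actually reaches the file.
CHUNK = b"x" * 1024           # 1 KiB per write
TARGET = 128 * 1024           # aim well past the suspected 64 KiB limit

with open("probe.out", "wb") as f:
    written = 0
    while written < TARGET:
        f.write(CHUNK)
        f.flush()
        os.fsync(f.fileno())  # force it out of the OS buffers too
        written += len(CHUNK)

print(os.path.getsize("probe.out"))
```

On an unrestricted host this prints 131072; if the file stops at 65536, the limit is real and not a buffering artifact.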

So now I am considering looking around for ways to run the process so that it is not so limited, and whether the whole thing can be bypassed by writing directly to MySQL, which was in my plan anyway.
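A minimal sketch of that database route is below, using the standard-library sqlite3 module as a stand-in since I can’t assume a MySQL server here; with MySQLdb the code would have the same shape, with connect() pointed at the DH database and %s placeholders instead of ?. Table and column names are made up for illustration.

```python
import sqlite3

# Stand-in sketch: write output rows to a database instead of a file,
# sidestepping any per-process file write limit.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE output (id INTEGER PRIMARY KEY, chunk TEXT)")
for chunk in ("<item>1</item>", "<item>2</item>"):
    conn.execute("INSERT INTO output (chunk) VALUES (?)", (chunk,))
conn.commit()
rows = conn.execute("SELECT COUNT(*) FROM output").fetchone()[0]
```

The XML could then be assembled from the rows later, or served straight from the database.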

Again, as before, any suggestion is appreciated. At this point I am also writing this as a reference, as other people might run into the same problem.

Thanks again Wholly


I certainly wouldn’t expect a 64K output limit to be an actual limitation, as otherwise we wouldn’t be able to write anything to disk. But research is good and gives us something to consider.

Any chance this could be a Python configuration limit? Not many of us use it, and we just might not have noticed.

The wiki entry on Python doesn’t describe any limitations, but it might give you something to work with.

Sorry that I jumped in here without anything solid but it didn’t seem like anyone else was helping with the thought process.


Yes, I read the wiki entry. And before being on Dreamhost I was on another host… and there, too, there was a similar limit. At the time I did not think of looking at the actual size of what was written, but it seems that the same processes that go silent here were going silent there, too. So it might very well be a Python limit. But then it might be some sort of standard configuration limit that is used when Python is offered as part of a hosting solution.

In any case, right now I am trying to get it to connect directly to MySQL, as this would also shortcut a series of other inelegancies in the code.

Thanks for keeping this stirred up, it does help.

Good luck!


I have run Python programs that generate >64K of output, so I don’t think it’s a Python limit. If the program is getting killed abnormally (which has happened to me) then it’s possible that some output was buffered in memory and never written to disk. If you’re writing to disk using a Python file object, then you can try calling the flush() method and see if you might get more of the output.
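To illustrate the flush() suggestion (a minimal sketch, assuming a plain Python file object; the filename and records are made up):

```python
# Minimal sketch: flush after each write so that, if the process is
# killed mid-run, everything written so far has already been handed
# to the operating system rather than sitting in Python's buffer.
f = open("partial.xml", "w")
for record in ("<a/>", "<b/>", "<c/>"):
    f.write(record + "\n")
    f.flush()    # push Python's internal buffer out immediately
f.close()
```

Without the flush(), a killed process can lose whatever was still buffered, which looks exactly like a truncated output file.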


Hello Larry, thanks for the answer.

Yes, I should have mentioned it: I do flush(), and I close(). I also tried flushing more often, but it makes no difference.

You say you have run python programs that generate more than 64K of output.
Was this output written to a file?
Was the python on dreamhost?

Because the program seems to run fine when I run it on my computer, and it generates way more than 64K of data. And the program also keeps outputting to the screen when I run it from Dreamhost, so strictly speaking it does output more than 64K. The limit seems to apply only to data written to a file.

Many thanks,

It turns out that the program I was thinking of didn’t write directly to a file; it wrote to stdout, which was redirected to a file. (It was running on DH.) I whipped up the following small program, which ran fine on DH and generated a 640K output file:

[code]ln = ('*' * 63) + '\n'  # 64 bytes

f = open('out', 'w')
for i in xrange(10240):  # 10K lines of 64 bytes = 640K
    f.write(ln)
f.close()
[/code]Changing 10240 to 102400 resulted in a 6+ MB file, and it still ran in less than 1 sec.