I have a shopping-style site where the products displayed come from XML files. These XML files are provided by another website, and I need to download them to my server once a week (at least). There are almost 11,500 files.
So I made a Python script that downloads all 11,500 XML files over HTTP and saves them to my server. The script uses threads, and until last month it was running fine. But now, every time I execute it, the script gets killed. I tried reducing the number of threads, downloading only a few files at a time, etc., but nothing seems to work.
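For reference, this is roughly the kind of setup I mean (a simplified sketch, not my exact script; the worker and batch-size values are just examples), using only the standard library:

```python
import os
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

def batched(items, size):
    """Split items into lists of at most `size` elements."""
    items = list(items)
    return [items[i:i + size] for i in range(0, len(items), size)]

def download_one(url, dest_dir):
    """Stream one file to disk in small chunks instead of
    reading the whole response into memory."""
    name = url.rsplit("/", 1)[-1]
    path = os.path.join(dest_dir, name)
    with urlopen(url, timeout=30) as resp, open(path, "wb") as out:
        while True:
            chunk = resp.read(64 * 1024)  # 64 KiB at a time
            if not chunk:
                break
            out.write(chunk)
    return path

def download_all(urls, dest_dir, workers=8, batch_size=500):
    """Download the files in modest batches with a bounded
    thread pool (example sizes, not tuned values)."""
    for batch in batched(urls, batch_size):
        with ThreadPoolExecutor(max_workers=workers) as pool:
            list(pool.map(download_one, batch, [dest_dir] * len(batch)))
```

Even with a bounded pool and batching like this, the script still gets killed on the server.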
Is there anything I can do to avoid this problem? What are the limitations of the server (is it the running time or the amount of processing that is too much)?
Thanks for any tips.