Well, it can be done, to a greater or lesser degree, depending upon your exact structure, but it can be complicated and may or may not work well with your current procedures.
Assuming that the firm requirements are 1) that FTP be used to accomplish the uploading, and 2) that these uploads are ultimately to be made available as part of the “site”, then I think your best bet would be to use DreamHost’s Remap a Sub-dir process (see the third paragraph in the linked section of the Wiki page). This allows a given FTP user to have access to a given sub-dir of a website (the dir residing below the main domain’s “web root”). There are some potential “gotchas” here, though, that you should be aware of.
First, convenience of management would probably dictate that the “master” account user use each given user’s FTP credentials to access those sub-dirs (to prevent permissions issues from developing). This can be worked around by clever manipulation of user groups/permissions (maybe cron-based?), but doing so would likely be problematic in “real world” usage.
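For what it’s worth, here is a minimal sketch of what such a cron-based permissions fix-up might look like. The paths and the group name are purely hypothetical placeholders (not DreamHost specifics), and the “demo setup” lines exist only to make the sketch self-contained:

```shell
# Hypothetical cron job: keep everything under the shared web dir
# group-owned and group-writable, so the "master" user and the FTP
# users can all work on the same files. Paths/group are placeholders.
WEBROOT="${WEBROOT:-/tmp/demo_webroot}"    # e.g. the dir holding the remapped sub-dirs
SHARED_GROUP="${SHARED_GROUP:-$(id -gn)}"  # a group all relevant users belong to

# Demo setup so this sketch runs on its own:
mkdir -p "$WEBROOT/dept1"
touch "$WEBROOT/dept1/upload.txt"
chmod 600 "$WEBROOT/dept1/upload.txt"      # simulate a file with tight perms

# The part you would actually schedule from crontab:
chgrp -R "$SHARED_GROUP" "$WEBROOT"
chmod -R g+rwX "$WEBROOT"                  # g+rwX: group rw on files, rwx on dirs

ls -l "$WEBROOT/dept1/upload.txt"
```

A crontab entry along the lines of `*/10 * * * * /home/master/fix-perms.sh` would run it every ten minutes, but, as I said above, this kind of juggling tends to be fragile in real-world use.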
Second, given DreamHost’s use of suEXEC, CGI and PHP-CGI will not work in these sub-dirs (as noted in the wiki article).
If neither of these issues is a “deal breaker” for you, then that is probably the way to go.
That said, if the use of FTP (or, preferably, SFTP for security reasons) is not an absolute requirement, you might consider an alternative approach. By providing a script-based uploading tool (or even a “file manager”-type application) that users access via a browser, you can eliminate the multiple-user management issues and retain a single user (yourself, or the “master” user) as “owner” of all the files.
This “plays well with others” in that there are no suEXEC permissions issues, and you can manage all the files conveniently from your own user’s account without having to jump through hoops.
There are many such programs readily and “freely” available that can provide this function (with varying degrees of quality, flexibility, and security); one that I use, and recommend, is uber-uploader. It is particularly nice in that it is focused on the single function of facilitating uploads, and, because of the way it uses both PHP and perl, it is not constrained by a given PHP environment’s upload_max_filesize setting.
By installing a copy of this for the exclusive use of each department (protected by .htaccess-based Apache authentication), and having each department’s copy place files only in that department’s sub-dir, you can allow convenient uploading to the appropriate directories while preventing “up-tree” exploring.
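As an illustration, the per-department protection might look something like the following .htaccess (the AuthUserFile path and the realm name are placeholders, not anything from a real setup; just keep the password file outside the web root):

```apacheconf
# Hypothetical .htaccess protecting one department's uploader copy
AuthType Basic
AuthName "Dept1 Uploads"
AuthUserFile /home/master/.htpasswd-dept1
Require valid-user
```

You would create the password file with Apache’s `htpasswd` utility, e.g. `htpasswd -c /home/master/.htpasswd-dept1 someuser`, and repeat with a separate file (or separate users) for each department’s copy.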
This is the method I use in similar situations, and I have found it to work very well - most users needing to upload find the web-based upload interface to be easier to use than FTP, and they are far less likely, in my experience, to “bork” stuff up that way than they are using FTP. I also think it’s more secure.
Does that help at all, or did I only confuse you further?