So are there no tools (rsync) or PHP scripts that allow transferring files from DH shared hosting to DreamObjects?
I am aware that there are some cURL methods, but I am totally unfamiliar with cURL.
If there is no way of moving/migrating hundreds of GBs of files that are ALREADY on DH shared hosting to DreamObjects, I would think someone really didn't think about a deal breaker… I have a client who is wanting to move to Amazon S3. If files that are already at DH can't be moved to DreamObjects, that's the loss of a DreamObjects signup AND an existing hosting customer.
Right now many of the shared hosting machines have a utility called boto-rsync installed on them. This is an rsync-like tool to transfer files to an object store compatible with the S3 API, e.g. DreamObjects.
To make things a little easier, you can create a file ~/.boto with the contents:
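Something along these lines, with your own DreamObjects access key and secret key substituted for the placeholders (the section and key names follow boto's standard config format):

```ini
[Credentials]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY
```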
Oh, what if the cloud worked just like native Linux and SSH:
I could simply upload a 100GB file to my objects directory, or to a freshly created directory within it.
I could control access to the file via .htaccess and .htgroup.
Then I could start using it RIGHT NOW! For now I have no clue how I should start; all I see in the docs is a bunch of command lines that only those who are already familiar with S3 or something similar can use. I'm probably too low-end.
BG01 - boto-rsync works similarly to the traditional rsync utility, though not exactly. Here’s the basic usage:
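A sketch of the general form (the exact flag names come from `boto-rsync -h`, and the endpoint hostname shown is an assumption based on DreamObjects' legacy endpoint):

```shell
# rsync-style: a source and a destination, where either side
# can be an s3:// URL pointing at a bucket.
boto-rsync -a YOUR_ACCESS_KEY -s YOUR_SECRET_KEY \
    --endpoint objects.dreamhost.com \
    /home/username/some_dir s3://some_bucket/some_dir
```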
Creating the ~/.boto file with your access key, secret key, and endpoint means you can leave them out of the command:
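For example (paths are placeholders):

```shell
# Credentials and endpoint are picked up from ~/.boto,
# so only the source and destination are needed.
boto-rsync /home/username/some_dir s3://some_bucket/some_dir
```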
As an example, to back up your home directory to a DreamObjects bucket called bg01, you'd do this:
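Assuming the credentials are already in ~/.boto, something like:

```shell
# Copy everything under the home directory into the bg01 bucket.
boto-rsync ~/ s3://bg01/
```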
There are lots of options to boto-rsync; just pass the -h flag to see them all.
Object storage is different from traditional file storage and can't be accessed in quite the same ways. It's designed to be accessed via an API, and, even simpler, files can be retrieved via a plain URL.
You can upload a 100GB file to your DreamObjects bucket; you'll just need to use something that supports the API. The easiest option, IMO, is CrossFTP, since it supports uploading large files in pieces. Options are available here: http://dhurl.org/25h.
Access to that file can be controlled by setting its permissions to private and sending a time-expired link to whomever needs it.
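For instance, the s3tools CLI mentioned later in this thread can generate such a link (bucket and object names here are placeholders, and s3cmd is assumed to already be configured with your keys):

```shell
# Generate a signed URL for a private object that expires
# 3600 seconds (1 hour) from now.
s3cmd signurl s3://my-bucket/big-file.zip +3600
```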
OK, my server doesn't have boto-rsync installed. I'm reluctant to just install software on someone else's server. Is there some other alternative, or can I just install boto from the Python sources (pip doesn't exist either)?
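Installing from source into your home directory should work without root, roughly like this (the directory name is illustrative; download and unpack a boto_rsync release tarball first):

```shell
cd boto_rsync-*/
# distutils' --user flag installs under ~/.local, no root needed
python setup.py install --user
# make the installed script findable
export PATH="$HOME/.local/bin:$PATH"
```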
All I can tell you is it works PERFECTLY. It does call for you to set it up on the server, but after you do… it works like a charm.
And believe me, I looked around for any and all kinds of tools/methods, and this one fit the bill for me, as my number one requirement was something easy to use, lol! I was so desperate that I even paid for this garbage service: http://www.s3rsync.com/. Don't waste your time or money. BUT had it worked, I would have used it for the GIGs I had to move quickly.
The s3tools command line tool does a LOT of things. You do have to install it on the server, and its security is good: it requires keys, etc., and you can even lock it down to your own IP ONLY. From there it has a command for just about anything you need to do. I was doing a 3-way transfer: from my local system, leaving a copy on the Linux server, and then putting a copy on S3. All the while it does it bit for bit; no lost packets, etc. Syncing is a breeze, and its checksums are bit-for-bit accurate.
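For reference, a typical s3cmd (s3tools) workflow looks something like this (bucket name and paths are placeholders):

```shell
# One-time setup: prompts for your access key, secret key, etc.
s3cmd --configure

# Create a bucket, then sync a local directory up to it,
# leaving the local copy in place; checksums verify each file.
s3cmd mb s3://my-bucket
s3cmd sync ~/backups/ s3://my-bucket/backups/
```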