Server to Server transfer of files

dreamobjects

#1

So are there no tools (rsync) or PHP scripts that allow transferring files from DH shared hosting to DreamObjects?

I am aware that there are some cURL methods, but I am totally unfamiliar with cURL.

If there is no way of moving / migrating hundreds of GBs of files that are ALREADY on DH shared hosting over to DreamObjects, I would think someone really didn't think about a deal breaker… I have a client who is wanting to move to Amazon S3. IF files that are already at DH can't be moved to DreamObjects, that's the loss of a D.O. signup AND an existing hosting customer.


#2

Right now many of the shared hosting machines have a utility called boto-rsync installed on them. This is an rsync-like tool to transfer files to an object store compatible with the S3 API, e.g. DreamObjects.
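
A quick way to check whether your particular machine has it is something like:

which boto-rsync

If that prints a path, you're set.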

To make things a little easier, you can create a file ~/.boto with the contents:

[Credentials]
aws_access_key_id = Your_Access_Key
aws_secret_access_key = Your_Secret_Key
[Boto]
s3_host = objects.dreamhost.com

I’d also recommend setting access permissions so only you can read it - chmod 400 ~/.boto

We’re also working on something now to make copying files and backing up your account easy to do from the panel.


#3

Is there some kind of documentation for this? I kinda get the picture of what you're saying, but if this is possible there should be some docs on it, right?


#4

Oh, what if the cloud worked just like native Linux and SSH:

  1. I can simply upload a 100GB file to my objects directory, or a freshly created directory within
  2. I can control access to the file via .htaccess and .htgroup

Then I could start using it RIGHT NOW! For now I have no clue how I should start; all I see in the docs is a bunch of command lines that only those who are already familiar with S3 or something similar can use. I'm probably too low-end.


#5

BG01 - boto-rsync works similarly to the traditional rsync utility, though not exactly. Here’s the basic usage:
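
Something along these lines (the flag names here are from memory, so double-check them with -h; the keys and endpoint are your own values):

boto-rsync -a Your_Access_Key -s Your_Secret_Key --endpoint objects.dreamhost.com SOURCE DESTINATION

where SOURCE and DESTINATION can each be a local path or an s3:// URL pointing at a bucket.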

Creating the ~/.boto file with your access key, secret key, and endpoint means you can exclude those when typing in the command:
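
With ~/.boto in place it should come down to just the two paths, something like this (the paths are only placeholders):

boto-rsync /home/username/some/directory/ s3://bucketname/some/directory/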

As an example, to back up your home directory to a DreamObjects bucket called bg01, you'd do this:
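
Assuming ~/.boto is set up as above, that would be roughly:

boto-rsync ~/ s3://bg01/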

There are lots of options to boto-rsync; just pass the -h flag to see them all.

Object storage is different from traditional file storage and can't be accessed in quite the same ways. It's designed to be accessed via an API and, even easier, files can be retrieved via a simple URL.
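
For instance, a public object ends up reachable at a plain address along the lines of https://objects.dreamhost.com/bucketname/filename (bucket and file names being placeholders, of course).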

You can upload a 100GB file to your DreamObjects bucket; you will just need to use something that supports the API. The easiest option, IMO, is CrossFTP since it supports uploading large files in pieces. Options are available here - http://dhurl.org/25h.

Access to that file can be controlled by setting the permissions to private and sending a time-expired link to whoever needs it.


#6

Thanks, will check it out.


#7

Personally, I use the --delete flag - “Delete extraneous files from destination dirs after the transfer has finished (e.g. rsync’s --delete-after).”
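
In practice that just means adding it to the command, e.g. something like (bucket name is a placeholder):

boto-rsync --delete ~/ s3://bucketname/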


#8

I wrote something that might help you out. Check out my post about a file lister for objects in a DreamObjects bucket. https://discussion.dreamhost.com/thread-138874.html


#9

OK, my server doesn’t have boto-rsync installed. I’m reluctant to just install software on someone else’s server. Is there some other alternative, or can I just install boto from the python sources (pip doesn’t exist, either).

Thanks,
John


#10

John, I ended up using this: http://s3tools.org/s3cmd-sync

All I can tell you is it works PERFECTLY. It does call for you to set it up on the server, but after you do… it works like a charm.
And believe me, I looked around for any and all kinds of tools / methods, and this one fit the bill for me, as my number one requirement was something easy to use lol! I was so desperate that I even paid for this garbage service http://www.s3rsync.com/. Don't waste your time or money, BUT had it worked I would have used it for the GIGS I had to move quickly.

The s3tools command line tool does A LOT of things. You do have to install it on the server; its security is good as it requires keys, etc… you can even lock it to your own IP ONLY. Then from there they have a damn command for just about anything you need to do. I was doing a three-way transfer… from my local system, leave a copy on the Linux server, and then put a copy on S3. All the while it does it bit for bit, no lost packets, etc. Syncing is a breeze, and their checksums are bit-for-bit accurate.
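
For anyone following along, the rough setup I remember is: run s3cmd --configure to create ~/.s3cfg with your keys, point it at DreamObjects by editing host_base and host_bucket in that file, then sync. Something like this (bucket and directory names are placeholders):

s3cmd --configure
# then edit ~/.s3cfg so it points at DreamObjects:
host_base = objects.dreamhost.com
host_bucket = %(bucket)s.objects.dreamhost.com
# create a bucket and sync a directory up:
s3cmd mb s3://bucketname
s3cmd sync /home/username/directory/ s3://bucketname/directory/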


#11

Thanks BG01. The archive doesn't seem to exist anymore - I think it migrated to Git and I can't find it.

Any clue?

Thanks again
John


#12

Their page here:
s3tools.org/download

Leads to here to download:
http://sourceforge.net/projects/s3tools/files/s3cmd/1.5.0-alpha1/#files
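
If you'd rather not install anything system-wide (per the question above), s3cmd can be run straight from the extracted tarball, roughly like this (version number taken from the link above; adjust to whatever you actually download):

tar xzf s3cmd-1.5.0-alpha1.tar.gz
cd s3cmd-1.5.0-alpha1
./s3cmd --configure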


#13

Thank you - I’ll give it a go

John