Upload file using Curl and Bash

Hello, I am trying to adapt a bash script that works with Amazon S3 to upload a file to a bucket using curl and bash. I can authenticate and try to send the file, but I am missing something…
Here is the script:
https://raw.githubusercontent.com/fredericofs/s3.bash/master/simple_version_working
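For anyone who can't open the link, the script follows the usual curl + openssl pattern for S3-compatible storage: build an AWS signature v2 string-to-sign, HMAC-SHA1 it with the secret key, and PUT the file. Roughly like this sketch (not the exact script; the bucket, file path, and keys are placeholders):

```bash
#!/usr/bin/env bash
# Rough sketch of the curl + openssl upload (placeholder values, not the exact script).
file="/tmp/testfile.txt"               # local file to upload
bucket="examplebucket"                 # target bucket
host="objects.dreamhost.com"           # DreamObjects endpoint
s3Key="YOUR_ACCESS_KEY"                # access key (placeholder)
s3Secret="YOUR_SECRET_KEY"             # secret key (placeholder)

contentType="text/plain"
dateValue="$(date -R)"
resource="/${bucket}/${file}"          # note: this keeps the leading "/tmp" in the key
stringToSign="PUT\n\n${contentType}\n${dateValue}\n${resource}"
signature="$(echo -en "${stringToSign}" \
  | openssl sha1 -hmac "${s3Secret}" -binary | base64)"

curl -X PUT -T "${file}" \
  -H "Host: ${host}" \
  -H "Date: ${dateValue}" \
  -H "Content-Type: ${contentType}" \
  -H "Authorization: AWS ${s3Key}:${signature}" \
  "https://${host}${resource}"
```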

And here is the output of my attempt to upload a text file:

I tried your bash script and it worked for me. In your output, it appears that it worked for you too. The md5sum in the response is correct for the file I uploaded. Did you refresh the page in the DreamHost panel before checking that the file was there?

The only thing I see that could be off is that the content-length is set to 0 in the response. I had that too but the file uploaded successfully. I’ll do some research tomorrow and find out if that’s expected behavior or not.

Strange as it seems, the file was uploaded to an invisible folder; it does not appear in Cyberduck, but I can see it on the web interface.

The file name in the example includes a directory (“/tmp/testfile.txt”), and that whole string is included in the URL. It looks like curl is appending the “/” character from “/tmp” to the URL, so DreamObjects sees both slashes and treats the first as a directory with no name.

If you look at the object in the panel, you’ll see the URL would be (note the 2 slashes before “tmp”): https://objects.dreamhost.com/examplebucket//tmp/testfile.txt

You could add something in your script to strip out the extra slash and be all set.
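Something like this would do it (a rough sketch; the variable names just mirror the ones above):

```bash
# Sketch: key the object by the file's base name instead of its full local path.
file="/tmp/testfile.txt"
object="$(basename "${file}")"         # "testfile.txt", no "/tmp" prefix
resource="/${bucket}/${object}"        # "/examplebucket/testfile.txt", single slash
# ...then sign ${resource} as before and PUT to "https://${host}${resource}"
```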

Fredfs: I wonder, why do you use curl and not awscli?

I am using curl to upload a small log file from the machine to the bucket, and it works very nicely inside the script.
It is also nice to have this handy script around to upload a file I need from some machine that doesn't have an S3 CLI installed.

At first I was using awscli for all my DreamObjects needs, but with new projects/machines/systems I started having many issues with awscli; there was always a new problem, be it locale, authentication, or files being downloaded again and again. I ended up finding botocore/Python to be very delicate.
I am now discovering tools for S3 written in Go, and they seem much faster, lower on memory, and more stable and error-free than botocore/Python.

Very cool.

Curl is handy and definitely ubiquitous. Just make sure you set those permissions properly, because you don’t want anyone else to be able to see your keys!
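For example (these paths are just hypothetical):

```bash
# Example only: keep the script and any file holding the keys readable by you alone.
chmod 700 ~/bin/s3-upload.sh           # hypothetical script location
chmod 600 ~/.dreamobjects-keys         # hypothetical credentials file
```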

Feel free to share any other tools or scripts that you find useful.