Upload multiple files AND large files

dreamobjects

#21

thank you very much. I will forward this to my dev and see if this helps. Are you able to do large chunked/multipart uploads too? That is the limitation we have encountered so far: we can do multi-file uploads with a single-file size limit of 50 MB, but that sacrifices being able to upload single large files up to 2 GB (the file size limit we have encountered for single large files).


#22

The 2 GB limit is PHP's, but I was able to upload a 400 MB file this way very recently.


#23

You mean 400 MB with a multi-file upload? Are you able to do any large single-file uploads along with the multi-file uploads? We have not been able to get both working together.


#24

Best of luck.
Note: this code works on Amazon's S3 server. As soon as I add metadata it throws the error I posted earlier in this thread; without metadata, it also works on DreamObjects.

Below is the code we are using for multiple uploads. It is working, but we need to send the metadata array('x-amz-meta-uuid' => '14365123651274'). As soon as I add that line it gives the error, even though this code works with the AWS S3 server.

<?php
/*
In order to upload files to S3 using the Flash runtime, one should start by
placing a crossdomain.xml file into the bucket.

In our tests Silverlight didn't require anything special and worked with this
configuration just fine. It may fall back to the same crossdomain.xml as a
last resort.

!!!Important!!! The Plupload UI Widget here is used only for demo purposes
and is not required for uploading to S3.
*/

// important variables that will be used throughout this example
$bucket = 'user-collabtest-22';

// these can be found on your Account page, under Security Credentials > Access Keys
$accessKeyId = 'xxxxxxxxxxxxxxxxxxx';
$secret = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx';

// prepare policy
$policy = base64_encode(json_encode(array(
    // ISO 8601 - date('c') generates an incompatible date, so better do it manually
    'expiration' => date('Y-m-d\TH:i:s.000\Z', strtotime('+1 day')),
    'conditions' => array(
        array('bucket' => $bucket),
        array('acl' => 'public-read'),
        array('starts-with', '$key', ''),
        // for demo purposes we are accepting only images
        array('starts-with', '$Content-Type', 'image/'),
        // Plupload internally adds a name field, so we need to mention it here
        array('starts-with', '$name', ''),
        // One more field to take into account: Filename - gets silently sent
        // by FileReference.upload() in Flash
        // http://docs.amazonwebservices.com/AmazonS3/latest/dev/HTTPPOSTFlash.html
        // This is the metadata condition that triggers the error described above
        array('x-amz-meta-uuid' => '14365123651274'),
        array('starts-with', '$Filename', ''),
    )
)));

// sign policy
$signature = base64_encode(hash_hmac('sha1', $policy, $secret, true));

?>

[The page's HTML markup was stripped by the forum; it contained the heading "Plupload to Amazon S3 Example", the Plupload widget container, and the fallback text "Your browser doesn't have Flash, Silverlight or HTML5 support."]


#25

Hi sadie8686,

It looks like you’re getting an error because you’re setting the policy document to specifically require the uuid meta header but it’s never being submitted by your form.

The policy document is a set of conditions that must be met to allow the upload to happen. When you say array('x-amz-meta-uuid' => '14365123651274') in your policy document, that means you're requiring the x-amz-meta-uuid field to be set to exactly 14365123651274 when the upload request happens.

In the example code, I don’t see anywhere where the x-amz-meta-uuid field is being set to 14365123651274 and so it fails the policy check.

I wonder if AWS S3 is not being as strict as their spec says and DreamObjects is following the specification guidelines to the letter.

Edit:
See the section on the sample form and you can see where there is a field for setting the x-amz-meta-uuid tag in a hidden field - http://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-post-example.html
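To illustrate the point above: the exact-match condition in the policy and a hidden field in the upload form have to carry the same value. Here is a minimal PHP sketch, reusing the placeholder uuid value from this thread (the variable names are mine, not from the original code):

```php
<?php
// The uuid value used in this thread; any fixed string works as long as
// the policy condition and the submitted form field match exactly.
$uuid = '14365123651274';

// Policy fragment: an exact-match condition on the metadata field.
$conditions = array(
    array('acl' => 'public-read'),
    array('starts-with', '$key', ''),
    // The POST request must include x-amz-meta-uuid set to exactly this value.
    array('x-amz-meta-uuid' => $uuid),
);

// The matching hidden input that must appear in the upload form --
// without it, the policy check fails with the error seen in this thread.
$hiddenField = '<input type="hidden" name="x-amz-meta-uuid" value="'
             . htmlspecialchars($uuid) . '" />';

echo $hiddenField . "\n";
```

With Plupload specifically, the same value can presumably also be supplied through its multipart_params setting so that it is sent along with each upload request.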


#26

OK, we will be testing this out, but keep in mind we still need to test the large upload of a single file too. Remember, we had to sacrifice large single-file uploads to get the multi-file uploads: we had multi-file uploads working with a 50 MB limit for any file in the group, but we could no longer do the large single-file chunked/multipart upload up to 2 GB.


#27

We are still having issues uploading large files. We can do the direct-to-server upload method with metadata, per your feedback, but the large chunked/multipart uploads are not working yet.
[hr]
Again, same issue as before: the chunked upload is not working, even though we got multiple uploads to work via your direct-to-server method with metadata. What gives? Do you know how to get both to work without sacrificing one?


#28

I see from your post on July 3rd that you're using Plupload. I just downloaded the 2.1.2 release of Plupload and copied it to my web server. I modified the examples/jquery/s3.php file so that it had my keys, bucket name, and DreamObjects URL, and removed the size and file type restrictions. With that, I was able to upload multiple files to DreamObjects, two of them over 50 MB.

Multiple files selected - http://screencaps.objects.dreamhost.com/07-10-2014-14-32-23.png

Upload in progress - http://screencaps.objects.dreamhost.com/07-10-2014-14-36-18.png

Success - http://screencaps.objects.dreamhost.com/07-10-2014-14-39-43.png

It looks to me like the functionality for uploading multiple files and chunked/multipart uploads is working as expected. Perhaps there’s an issue with how you’re setting the metadata that’s causing problems?

Here’s what my s3.php file looks like (without my keys):

[php]

<?php
/*
In order to upload files to S3 using the Flash runtime, one should start by
placing a crossdomain.xml file into the bucket.

In our tests Silverlight didn't require anything special and worked with this
configuration just fine. It may fall back to the same crossdomain.xml as a
last resort.

!!!Important!!! The Plupload UI Widget here is used only for demo purposes
and is not required for uploading to S3.
*/

// important variables that will be used throughout this example
$bucket = 'justin-demo';

// these can be found on your Account page, under Security Credentials > Access Keys
$accessKeyId = 'Access_Key';
$secret = 'Secret_Key';

// prepare policy
$policy = base64_encode(json_encode(array(
    // ISO 8601 - date('c') generates an incompatible date, so better do it manually
    'expiration' => date('Y-m-d\TH:i:s.000\Z', strtotime('+1 day')),
    'conditions' => array(
        array('bucket' => $bucket),
        array('acl' => 'public-read'),
        array('starts-with', '$key', ''),
        // the file type restriction was removed - any Content-Type is accepted
        array('starts-with', '$Content-Type', ''),
        // Plupload internally adds a name field, so we need to mention it here
        array('starts-with', '$name', ''),
        // One more field to take into account: Filename - gets silently sent
        // by FileReference.upload() in Flash
        // http://docs.amazonwebservices.com/AmazonS3/latest/dev/HTTPPOSTFlash.html
        array('starts-with', '$Filename', ''),
    )
)));

// sign policy
$signature = base64_encode(hash_hmac('sha1', $policy, $secret, true));

?>

[The page's HTML markup was stripped by the forum; it contained the heading "Plupload to Amazon S3 Example", the Plupload widget container, and the fallback text "Your browser doesn't have Flash, Silverlight or HTML5 support."]

[/php]

Edit 2016-11-02: Using a crossdomain.xml file is unsupported. Instead, you’ll want to specify a CORS config following these instructions from our knowledge base.
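For reference, a CORS configuration for an S3-compatible bucket generally follows the shape sketched below; the allowed origin is a placeholder, and the exact rules you need should come from the knowledge-base instructions mentioned above:

```xml
<CORSConfiguration>
  <CORSRule>
    <!-- Replace with the origin of the page that hosts the uploader -->
    <AllowedOrigin>https://example.com</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
```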


#29

Ok, but my dev said he can do multiple uploads too. Can you show a successful upload of a single large file, say 1-2 GB, with that same setup?
[hr]
The 200 MB file upload within the multiple uploads was cool. Let's see if you can upload a file over 1 GB now. What is the max size limit, by the way, for a single large-file upload? We previously hit 2 GB; that is what we are having issues with now. How can we let users also upload one large file, say a zip that is 1-2 GB or even larger, and still have the multiple-file upload option you just illustrated? We are currently unable to give users both options.


#30

The max upload size depends on your chunk size. You can have up to 10,000 chunks/parts, so the max upload size is chunk size × 10,000.
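The arithmetic above can be sketched directly; the 50 MB chunk size mentioned earlier in the thread is used here purely as an example:

```php
<?php
// S3-style multipart uploads allow at most 10,000 parts per object,
// so the largest possible object is (chunk size) x 10,000.
$chunkSize = 50 * 1024 * 1024;       // 50 MB chunks, as discussed earlier
$maxParts  = 10000;
$maxObject = $chunkSize * $maxParts; // 524,288,000,000 bytes (~488 GiB)
echo $maxObject . "\n";
```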

Trying to upload a 4GB file. Will let you know if and when it finishes.

http://screencaps.objects.dreamhost.com/07-10-2014-16-24-39.png


#31

I really appreciate the help!!! Thank you guys for looking into this so quickly.


#32

And it just finished successfully.

http://screencaps.objects.dreamhost.com/07-10-2014-17-17-06.png

And here’s the file in DreamObjects - http://screencaps.objects.dreamhost.com/07-10-2014-17-26-35.png


#33

Ok, A) I will show all this to my dev now… B) any ideas on what he may be doing wrong? We are hosting with another provider. Are there any settings you can think of?


#34

it seems to be working now. i will update you soon.


#35

Oops, missed your message from Thursday. I’m glad it’s working for you now!


#36

I emailed support some things; I didn’t want to post files here, so check with them. Everything works on my Mac, and I’m now checking Windows. My dev in India has Linux and Windows working, but Mac doesn’t work for him.


#37

FWIW, the uploads I posted screen shots from were on a Mac using Safari 7.0.5


#38

So, on my Windows machine, the multiple upload gives an error saying the upload URL may be wrong, and the big upload starts but freezes. For my dev in India, both multi and large uploads work on Windows and Linux, but his Mac does neither. For me, Mac does both; on Windows the multi upload doesn’t work. At least it’s uploading now, but I get a Windows “page unresponsive” error.
[hr]
This time the large file worked fast for me on Windows; I guess it was a connection issue. Multiple files still do not work for me on Windows. Everything works on Mac. Do you need me to post my message with the screens and code here from my dev, or did support forward them to you?


#39

I got them. I have a few meetings today but will get a chance to take a look this afternoon.


#40

Summary:

Me: my Mac tested fine for both on the test URL I emailed you. On Windows, the multiple upload does not work for me, and the large upload does not work now either (I mistakenly thought it worked; I posted details below).

My dev: Linux and Windows = everything works for him. OS X = nothing works for him. I forwarded his screenshots and code in the email you have.
[hr]
FYI, it failed and gave the same error on Windows for the large file for me too. It was going at the slower speed it should, just like on my Mac, but when I tried again it started uploading abnormally fast; when it does the fast upload, it hits 100% but says “uploaded 0/1 files.”

The large file is really 1.09 GB, but it uploads as 1 GB. Any ideas why?