Friday, January 15, 2016

Large AWS uploads failing after random amount of time

With Rails, I've followed this Heroku article to allow direct file uploads to an S3 bucket. I switched to this approach because my previous implementation didn't handle multipart uploads (so, large files). Once I implemented this method, large files uploaded just fine, except for really large ones.

I should note that I strayed from the article a little: I'm using v1 of the aws gem because of our Rails version.

This is how I'm set up:

S3_BUCKET = AWS::S3.new.buckets[ENV['S3_BUCKET_NAME']]

def set_s3_post_url
  @s3_media_post_url = S3_BUCKET.presigned_post(
    key: "product_media/#{SecureRandom.uuid}-$(unknown)",
    success_action_status: '201',
    acl: 'public-read'
  )
end
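For context, the object this returns is what the browser-side upload form is built from. A minimal sketch, assuming aws-sdk v1's AWS::S3::PresignedPost API (the url and fields accessors):

```ruby
post = S3_BUCKET.presigned_post(
  key: "product_media/#{SecureRandom.uuid}-$(unknown)",
  success_action_status: '201',
  acl: 'public-read'
)

post.url     # the bucket endpoint the browser POSTs the file to
post.fields  # hidden form fields (policy, signature, etc.) that must accompany the file
```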

As mentioned, this works for large files (~1 GB), but when I try to upload one that is, say, 10 GB, it gets to a mostly-uploaded state and then fails at a random point: sometimes after 20 minutes, sometimes after an hour. I thought the signed URL might be expiring, so I explicitly set a long expiry with expires: Time.now + 4.hours, but that didn't seem to help.
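For reference, this is roughly what the expiry attempt looked like (a sketch, assuming aws-sdk v1, where presigned_post accepts an :expires option taking a Time or a number of seconds; 4.hours is ActiveSupport):

```ruby
def set_s3_post_url
  @s3_media_post_url = S3_BUCKET.presigned_post(
    key: "product_media/#{SecureRandom.uuid}-$(unknown)",
    success_action_status: '201',
    acl: 'public-read',
    expires: Time.now + 4.hours  # extended expiry; uploads still failed mid-transfer
  )
end
```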

I would really appreciate some help with this if anyone has any ideas!
