Getting S3 objects with Rails

Problem

I have been using the aws-sdk gem with Rails for uploading and downloading objects, and I believe it's working fine.

Demo code:

require 'aws-sdk'  # v1 of the SDK, which the AWS::S3 API below belongs to

begin
  s3 = AWS::S3.new
  bucket = s3.buckets[ENV['BUCKET_NAME']]

  # Upload the data, then build a pre-signed GET URL for downloading it as CSV
  bucket.objects.create(fname, data)
  object = bucket.objects[fname]
  exported_url = object.url_for(:get,
                                expires: 3.weeks,
                                response_content_type: "text/csv",
                                response_content_disposition: "attachment; filename=#{fname}.csv").to_s
rescue AWS::Errors::Base => e
  # Base error class for the v1 SDK (the Aws:: namespace belongs to v2+)
  puts e.message
end

Are there any issues when uploading a large file to S3, that is, issues that would affect my code? Will it handle large files, or should I go for the multipart upload options in S3?

Solution

There are two issues you should watch out for:

  1. This will block the current thread while it's running, and for a large file that could be a substantial amount of time. Consider moving this code out to a Sidekiq (or similar) worker so that you aren't blocking threads in your web app (see the sketch after this list).
  2. When you rescue exceptions, you just puts them, which you're likely to miss in production. You should log the exception somewhere you can get hold of it later. Personally I would not rescue it at all and would use Bugsnag or a similar service in my application to ensure that errors are collected and kept for resolution.
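
Here is a minimal sketch of both points, assuming Sidekiq is already set up in the app; the worker class name and its arguments are illustrative, not taken from the original code:

class S3ExportWorker
  include Sidekiq::Worker

  # In practice you would usually pass a record id rather than the raw data,
  # since Sidekiq serializes job arguments to Redis as JSON.
  def perform(fname, data)
    s3 = AWS::S3.new
    bucket = s3.buckets[ENV['BUCKET_NAME']]
    bucket.objects.create(fname, data)
    object = bucket.objects[fname]

    # In a real app you would store or deliver this URL (e.g. save it on a
    # record or email it) rather than just building it.
    object.url_for(:get,
                   expires: 3.weeks,
                   response_content_type: "text/csv",
                   response_content_disposition: "attachment; filename=#{fname}.csv").to_s

    # No rescue here: if the upload fails, the exception propagates, Sidekiq
    # retries the job, and an error tracker such as Bugsnag records it.
  end
end

# In the controller or model, enqueue the job instead of uploading inline:
# S3ExportWorker.perform_async(fname, data)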
