public-ish write access to S3 bucket with file size limiting

0 votes

I would like to provide public-ish write access to an S3 bucket, but with the option to limit the upload size.

Enabling public write access to the bucket via bucket policies is not enough, because there is no S3 policy condition that limits upload size.

Another recommended solution is to use a signed POST policy, which supports a content-length-range condition, but I cannot use this directly because the policy requires an expiration date, which obviously changes for each request. I cannot ship the (desktop) application with actual credentials, which means the application cannot sign the policies itself.

After further study I see the following options:

  • a) create a lambda endpoint to which I upload the file

    The Lambda function would verify the file size and copy the file to the S3 bucket. The size limit is small (a few MB at most), so the Lambda execution time limit shouldn't be a problem.

  • b) create a lambda endpoint that generates signed POST policy

    Uploading a file would then take two requests: (1) a GET request to fetch a new signed policy, and (2) a direct upload to S3 using the retrieved policy.

  • c) create a lambda endpoint that generates pre-signed URLs

    Pretty much the same as b).
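For reference, option (a) can be sketched as a minimal Lambda handler behind API Gateway. This is a hypothetical illustration, not a complete implementation: the event shape assumes API Gateway proxy integration, the limit constant is made up, and the actual S3 write is elided.

```python
import base64
import json

MAX_BYTES = 5 * 1024 * 1024  # hypothetical 5 MB limit

def handler(event, context):
    """Reject oversized uploads before anything is written to S3."""
    # API Gateway delivers binary request bodies base64-encoded.
    if event.get("isBase64Encoded"):
        body = base64.b64decode(event["body"])
    else:
        body = event["body"].encode()

    if len(body) > MAX_BYTES:
        return {"statusCode": 413,
                "body": json.dumps({"error": "file too large"})}

    # ... here the function would write `body` to the S3 bucket ...
    return {"statusCode": 200,
            "body": json.dumps({"size": len(body)})}
```

Note that with this option the whole file passes through Lambda, so the client pays for the extra hop; options (b) and (c) avoid that by letting the client talk to S3 directly.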


  • What would be the recommended approach, and what are the pros/cons of each?

  • Is there a practical difference between using signed POST policies and pre-signed URLs, especially considering that the pre-signed URL uses the same POST policy anyway?

Sep 19, 2018 in AWS by bug_seeker
• 15,350 points

1 answer to this question.

0 votes

You can specify the content length range in a pre-signed POST.

Look for the content-length-range condition in the S3 POST policy documentation.

We solved this by using a Lambda function that generates the pre-signed POST for the upload:

Browser -> Lambda (pre-signed POST) -> Browser -> POST to S3

If the uploaded file's length falls outside the range (between the minimum and maximum), the POST will fail.

Hope it helps.

answered Sep 19, 2018 by Priyaj
• 56,520 points
