How to do parallel uploads to the same s3 bucket directory with s3cmd?

+4 votes

I'm trying to run the following commands, but the uploaded index.js turns out corrupted.

Any idea why?

gzip dist/production/index.js
mv dist/production/index.js.gz dist/production/index.js

s3cmd --access_key="$S3_ACCESS_KEY" --secret_key="$S3_SECRET_KEY" \
      --acl-public --no-mime-magic --progress --recursive         \
      --exclude "dist/production/index.js" \
      put dist/production/ \
      "s3://${BUCKET}/something/${BUILD_IDENTIFIER}/production/" &

s3cmd --access_key="$S3_ACCESS_KEY" --secret_key="$S3_SECRET_KEY" \
      --acl-public --no-mime-magic --progress --recursive         \
      --add-header="Content-Encoding:gzip" \
      put dist/production/index.js \
      "s3://${BUCKET}/something/${BUILD_IDENTIFIER}/production/" &


If the two uploads are not run in parallel, everything turns out fine.
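One thing worth checking (a guess, not verified against your setup): s3cmd matches `--exclude` patterns against the path relative to the source directory, so for `put dist/production/ …` the candidate path is `index.js`, which the pattern `dist/production/index.js` never matches. If so, both background commands upload the same `index.js` key at once and race. A pure-shell sketch of the pattern mismatch, no s3cmd needed:

```shell
# Path as s3cmd would see it, relative to the dist/production/ source dir:
rel="index.js"

# The exclude pattern from the question only matches the full prefixed path,
# so the relative path slips through and gets uploaded by both commands:
case "$rel" in
    dist/production/index.js) verdict="excluded" ;;
    *)                        verdict="uploaded by both commands" ;;
esac
echo "$verdict"
```

If this is the cause, changing the exclude to `--exclude "index.js"` should stop the two processes from writing the same key.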

Mar 27, 2018 in Cloud Computing by hemant
• 5,750 points

5 answers to this question.

+2 votes

If you are gzipping on purpose, why not keep the .gz extension? The extension matters a lot once a browser handles the file.

gzip dist/production/index.js   # upload the resulting index.js.gz as-is, no rename

If you download the file with plain S3 and verify the hash, it should be the same file. I verified this myself.
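The hash check described above can be reproduced locally without touching S3 (the `index.js` content here is just a stand-in for the real build output):

```shell
# Create a stand-in for the build artifact:
printf 'alert(1);\n' > index.js
orig=$(md5sum index.js | cut -d' ' -f1)

# gzip -c writes to stdout, keeping the original file for comparison:
gzip -c index.js > index.js.gz
roundtrip=$(gunzip -c index.js.gz | md5sum | cut -d' ' -f1)

[ "$orig" = "$roundtrip" ] && echo "round-trip hash matches"
```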

answered Aug 13, 2018 by bug_seeker
• 14,980 points
+2 votes

Isn't your problem with this line?

mv dist/production/index.js.gz dist/production/index.js

You are renaming the gzipped file to index.js, so the file you upload is gzip data, not the plain index.js text file.

Hope it helps.
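You can confirm that the renamed file is gzip data rather than JavaScript by inspecting its first two bytes, which carry the gzip magic number `1f 8b` (the file name here is just for illustration):

```shell
# Stand-in for the build output:
printf 'console.log("hi");\n' > index.js

# gzip replaces index.js with index.js.gz:
gzip -f index.js

# First two bytes of any gzip stream are the magic number 1f 8b:
magic=$(head -c 2 index.js.gz | od -An -tx1 | tr -d ' \n')
echo "$magic"   # 1f8b
```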

answered Oct 11, 2018 by findingbugs
• 4,730 points
0 votes

You can try using this:

s3upload_single() {
    n="$1"                                   # file name passed in by parallel
    mimetype=$(file --mime-type -b "$n")
    fullpath=$(readlink -f "$n")
    # NOTE: the target key below is an assumption; the original snippet
    # referenced an undefined $filePathWithExtensionChanged variable.
    s3cmd put -m "$mimetype" --acl-public "$fullpath" "s3://tff-xenforo-data/$n"
}
export -f s3upload_single

find . -name "*.data" | parallel s3upload_single
answered Oct 16, 2018 by Priyaj
• 56,140 points
+1 vote

You can use s3cmd-modification, a fork of s3cmd that adds put/get/sync with multiple parallel workers:

$ git clone <s3cmd-modification repository URL>
$ cd s3cmd-modification
$ python setup.py install
$ s3cmd --parallel --workers=4 sync /source/path s3://target/path
answered Oct 16, 2018 by Rohan
+1 vote

Use the AWS CLI instead. It uploads files in parallel out of the box and is very fast for both uploads and downloads.
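A sketch of the equivalent upload with the AWS CLI, assuming the bucket and path variables from the question; the concurrency value is my choice, and this is a configuration fragment that needs real credentials to run:

```shell
# The CLI parallelizes transfers itself; tune the worker count once:
aws configure set default.s3.max_concurrent_requests 20

# Upload everything except index.js (excludes are relative to the source dir):
aws s3 cp dist/production/ "s3://${BUCKET}/something/${BUILD_IDENTIFIER}/production/" \
    --recursive --exclude "index.js" --acl public-read

# Upload the gzipped index.js with the correct encoding header:
aws s3 cp dist/production/index.js "s3://${BUCKET}/something/${BUILD_IDENTIFIER}/production/index.js" \
    --content-encoding gzip --acl public-read
```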

answered Oct 16, 2018 by abc

