How to do parallel uploads to the same s3 bucket directory with s3cmd

+4 votes

I'm trying to run the following code, but index.js turns out to be corrupted.

Any idea why?

gzip dist/production/index.js
mv dist/production/index.js.gz dist/production/index.js

s3cmd --access_key="$S3_ACCESS_KEY" --secret_key="$S3_SECRET_KEY" \
      --acl-public --no-mime-magic --progress --recursive         \
      --exclude "dist/production/index.js" \
      put dist/production/ \
      "s3://${BUCKET}/something/${BUILD_IDENTIFIER}/production/" &

s3cmd --access_key="$S3_ACCESS_KEY" --secret_key="$S3_SECRET_KEY" \
      --acl-public --no-mime-magic --progress --recursive         \
      --add-header="Content-Encoding:gzip" \
      put dist/production/index.js \
      "s3://${BUCKET}/something/${BUILD_IDENTIFIER}/production/" &

wait


If the two uploads are run sequentially instead of in parallel, everything turns out fine.

Mar 27, 2018 in Cloud Computing by hemant
• 5,790 points
4,113 views

5 answers to this question.

+2 votes

If you are gzipping on purpose, why not keep the .gz extension? The extension matters a lot once a browser handles the file. Instead of renaming with

mv dist/production/index.js.gz dist/production/index.js

just upload index.js.gz as-is. If you download it with plain S3 and verify the hash, it should be the same file. I did verify it.
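As a sketch of that suggestion (assuming the same S3_ACCESS_KEY, S3_SECRET_KEY, BUCKET and BUILD_IDENTIFIER variables as the question), the gzipped file can be uploaded under its .gz name, with the MIME type and Content-Encoding set so browsers still decompress it transparently:

```shell
# Sketch only: keeps the .gz suffix instead of renaming, and labels the
# object so browsers decompress it on the fly. Variables as in the question.
upload_gz() {
    s3cmd --access_key="$S3_ACCESS_KEY" --secret_key="$S3_SECRET_KEY" \
          --acl-public \
          --mime-type="application/javascript" \
          --add-header="Content-Encoding:gzip" \
          put dist/production/index.js.gz \
          "s3://${BUCKET}/something/${BUILD_IDENTIFIER}/production/index.js.gz"
}
```

Referencing index.js.gz in the HTML then serves the compressed bytes with the right headers, without any local rename step.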

answered Aug 13, 2018 by bug_seeker
• 15,520 points
+2 votes

Isn't your problem with this line?

mv dist/production/index.js.gz dist/production/index.js

You are renaming the gzipped file to index.js, so what gets uploaded is gzip data, not the plain index.js text file.

Hope it helps.
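That effect is easy to reproduce locally: after the gzip + mv from the question, index.js contains gzip bytes, which file(1) reports plainly (a throwaway dist/production tree is created here just for the demonstration):

```shell
# Minimal local reproduction of the rename from the question.
mkdir -p dist/production
echo 'console.log("hello");' > dist/production/index.js
gzip -f dist/production/index.js
mv dist/production/index.js.gz dist/production/index.js
# file(1) now reports gzip compressed data, not ASCII/JavaScript text:
file dist/production/index.js
```

So the object in S3 is only "corrupted" in the sense that it is compressed; it needs a Content-Encoding: gzip header (or the .gz name) to be served correctly.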

answered Oct 11, 2018 by findingbugs
• 4,780 points
0 votes

You can try using this:

# Uploads one file; GNU parallel runs many copies of it concurrently.
s3upload_single() {
    n=$1
    # Derive a lowercase extension and the MIME type from file(1).
    extension=$(file "$n" | cut -d ' ' -f2 | awk '{print tolower($0)}')
    mimetype=$(file --mime-type "$n" | cut -d ' ' -f2)
    fullpath=$(readlink -f "$n")

    # Swap the .data suffix for the detected extension, then strip
    # everything up to and including "internal_data" from the path.
    changed="${fullpath/.data/.$extension}"
    filePathWithExtensionChanged=${changed#*internal_data}

    s3cmd put -m "$mimetype" --acl-public "$fullpath" \
        "s3://tff-xenforo-data${filePathWithExtensionChanged}"
}
# export -f makes the function visible to the shells parallel spawns.
export -f s3upload_single

find . -name "*.data" | parallel s3upload_single
answered Oct 16, 2018 by Priyaj
• 58,090 points
+1 vote

You can just use s3cmd-modification, which allows you to put/get/sync with multiple workers in parallel:

$ git clone https://github.com/pcorliss/s3cmd-modification.git
$ cd s3cmd-modification
$ python setup.py install
$ s3cmd --parallel --workers=4 sync /source/path s3://target/path
answered Oct 16, 2018 by Rohan
+1 vote

Use the AWS CLI. It supports parallel upload of files and it is really fast at both uploading and downloading.

http://docs.aws.amazon.com/cli/latest/reference/s3/
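As a sketch (not tested against a live bucket, and assuming the same BUCKET and BUILD_IDENTIFIER variables as the question), the pair of backgrounded s3cmd calls collapses to one sync plus one cp, since the CLI parallelizes transfers internally:

```shell
# Sketch assuming the question's env vars; aws s3 sync/cp parallelize
# transfers on their own, so no manual backgrounding with & is needed.
s3_upload_dist() {
    # Everything except index.js, MIME types guessed per file.
    aws s3 sync dist/production/ \
        "s3://${BUCKET}/something/${BUILD_IDENTIFIER}/production/" \
        --exclude "index.js" --acl public-read

    # The gzipped index.js, labelled so browsers decompress it.
    aws s3 cp dist/production/index.js \
        "s3://${BUCKET}/something/${BUILD_IDENTIFIER}/production/index.js" \
        --acl public-read --content-encoding gzip
}
```

The CLI's concurrency is also tunable, e.g. `aws configure set default.s3.max_concurrent_requests 20` (the default is 10).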

answered Oct 16, 2018 by abc
