How to move files from Amazon EC2 to an S3 bucket using the command line

+3 votes

In my Amazon EC2 instance, I have a folder named uploads. In this folder I have 1000 images. Now I want to copy all the images to my new S3 bucket. How can I do this?

Aug 29, 2018 in AWS by bug_seeker
• 15,350 points
2,518 views

5 answers to this question.

+1 vote

There are multiple ways to achieve this:

1. Using s3cmd

To download a file from a bucket:

s3cmd get s3://AWS_S3_Bucket/dir/file

Take a look at the s3cmd documentation.

If you are on Ubuntu or Debian, run this on the command line:

sudo apt-get install s3cmd

On CentOS or Fedora:

yum install s3cmd
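
Since the question is about uploading from the EC2 instance to S3 rather than downloading, the commands run the other way: s3cmd put or s3cmd sync. A rough example, where the bucket name and folder path are placeholders:

s3cmd put /home/ubuntu/uploads/* s3://AWS_S3_Bucket/uploads/

or, to sync the whole folder:

s3cmd sync /home/ubuntu/uploads/ s3://AWS_S3_Bucket/uploads/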

2. Using the AWS CLI

Use sync instead of cp

Syntax:

aws s3 sync <source> <target> [--options]
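
For example, to push the uploads folder from the instance into the new bucket (the bucket name and local path below are placeholders):

aws s3 sync /home/ubuntu/uploads s3://my-new-bucket/uploads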
answered Aug 29, 2018 by Priyaj
• 56,520 points
+1 vote

Hey, there are 3 ways you can do this:

To download using the AWS S3 CLI:

aws s3 cp s3://WholeBucket LocalFolder --recursive
aws s3 cp s3://Bucket/Folder LocalFolder --recursive

To do it from code, use the AWS SDK (a short upload sketch follows below).

To use a GUI, use Cyberduck.
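
For the SDK route, here is a minimal sketch using the AWS SDK for Java 1.x TransferManager to push the uploads folder from the question into a bucket. The bucket name, local path, and class name are placeholders, and credentials are assumed to come from the default provider chain (for example, the EC2 instance role):

    import java.io.File;
    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.transfer.MultipleFileUpload;
    import com.amazonaws.services.s3.transfer.TransferManager;
    import com.amazonaws.services.s3.transfer.TransferManagerBuilder;

    public class UploadFolderToS3 {
        public static void main(String[] args) throws InterruptedException {
            // Placeholders: replace with your bucket name and the folder on the instance
            String bucketName = "my-new-bucket";
            File uploadsDir = new File("/home/ubuntu/uploads");

            // Credentials come from the default provider chain
            // (environment variables, ~/.aws/credentials, or the EC2 instance role)
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
            TransferManager tm = TransferManagerBuilder.standard().withS3Client(s3).build();

            // Upload every file in the folder (including subfolders) under the "uploads/" prefix
            MultipleFileUpload upload = tm.uploadDirectory(bucketName, "uploads", uploadsDir, true);
            upload.waitForCompletion();

            tm.shutdownNow();
            System.out.println("Upload complete");
        }
    }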

Hope it Helps.. 

answered Oct 9, 2018 by Omkar
• 67,380 points
0 votes

I've used a few different methods to copy Amazon S3 data to a local machine, including s3cmd, and by far the easiest is Cyberduck. All you need to do is enter your Amazon credentials and use the simple interface to download/upload/sync any of your buckets/folders/files.

answered Oct 9, 2018 by Upasana
• 8,530 points
0 votes

Here is some Java code (using the AWS SDK for Java) to connect to S3, list the buckets, and download their contents.

    // Connect to S3 with the stored credentials and list the account's buckets
    private static void dBConnection() {
        app.setAwsCredentials(CONST.getAccessKey(), CONST.getSecretKey());
        conn = new AmazonS3Client(app.getAwsCredentials());
        app.setListOfBuckets(conn.listBuckets());
        System.out.println(CONST.getConnectionSuccessfullMessage());
    }

    // Download every object whose key matches the desired key
    private static void downloadBucket() {
        do {
            for (S3ObjectSummary objectSummary : app.getS3Object().getObjectSummaries()) {
                app.setBucketKey(objectSummary.getKey());
                app.setBucketName(objectSummary.getBucketName());
                if (objectSummary.getKey().contains(CONST.getDesiredKey())) {
                    // Download the object to a local file
                    try {
                        s3Client = new AmazonS3Client(new ProfileCredentialsProvider());
                        s3Client.getObject(
                                new GetObjectRequest(app.getBucketName(), app.getBucketKey()),
                                new File(app.getDownloadedBucket()));
                    } catch (AmazonClientException e) {
                        // getObject reports failures via AmazonClientException / AmazonServiceException
                        e.printStackTrace();
                    }

                    do {
                        if (app.getBackUpExist()) {
                            System.out.println("Converting back up file");
                            app.setCurrentPacsId(objectSummary.getKey());
                            passIn = app.getDataBaseFile();
                            CONVERT = new DataConversion(passIn);
                            System.out.println(CONST.getFileDownloadedMessage());
answered Oct 9, 2018 by Nilesh
• 6,880 points
0 votes

The AWS Command Line Interface works much like boto and can be installed using sudo easy_install awscli or sudo pip install awscli.

Once installed, you can then simply run:

Command:

aws s3 sync s3://mybucket .

Output:

download: s3://mybucket/test.txt to test.txt 

download: s3://mybucket/test2.txt to test2.txt

This will download all of your files (one-way sync). It will not delete any existing files in your current directory (unless you specify --delete), and it won't change or delete any files on S3.
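
For instance, to make the local copy an exact mirror of the bucket, removing local files that no longer exist in S3 (same placeholder bucket name as above):

aws s3 sync s3://mybucket . --delete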

You can also do S3 bucket to S3 bucket, or local to S3 bucket sync.
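
For example (the bucket names and local path are placeholders):

aws s3 sync /home/ec2-user/uploads s3://mybucket/uploads

aws s3 sync s3://mybucket s3://mybucket-backup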

Check out the documentation and other examples:

http://docs.aws.amazon.com/cli/latest/reference/s3/sync.html

answered Oct 9, 2018 by anonymous
