Boto3 - python script to view all directories and files


I tried to follow the Boto3 examples, but have only managed to get the basic listing of all my S3 buckets via the example they give:

import boto3
s3 = boto3.resource('s3')
for bucket in s3.buckets.all():
    print(bucket.name)

I cannot find documentation that explains how I would be able to traverse or change into folders and then access individual files.

I'm trying to get to my SNS delivery reports, which are stored in a folder for each day of the month, so it is a pain to manually download each file for the month and then concatenate the contents of each file to get the count of all SMS messages sent that month.

Does anyone have an example of a script that can help me with this, or pointers to basic documentation/examples that would help me do this?

I have 3 S3 buckets, and all the files are located in subfolders in one of them:

bucketname
|->Year
  |->Month
     |->Day1
     |->Day2
     |->Day3 
     |->Day4

etc. Underneath each "Day" folder is a single text file called 001.txt. So I am trying to concatenate all the 001.txt files for a month and then find the row count of the concatenated text file, which would give me the count of all SMS sent, successful and failed.

Any help, much appreciated.

Sep 17, 2018 in AWS by bug_seeker

1 answer to this question.


There are no folders in S3, only object keys. The "folders" you see in the console are just a naming convention built from `/` separators in the keys.

Using the Bucket resource interface, you can filter the list of objects in a bucket with the objects collection's filter() method.

You can also use the Client interface to call list_objects() with a suitable prefix and delimiter to retrieve subsets of objects.

See Listing Keys Hierarchically for a high-level description.

answered Sep 17, 2018 by Priyaj
