AWS S3 - Listing all objects inside a folder without the prefix

+2 votes

I'm having problems retrieving all objects (filenames) inside a folder in AWS S3. Here's my code:

ListObjectsRequest listObjectsRequest = new ListObjectsRequest()
        .withPrefix(folderName + "/")
        .withMarker(folderName + "/")

ObjectListing objectListing = amazonWebService.s3.listObjects(listObjectsRequest)

for (S3ObjectSummary summary : objectListing.getObjectSummaries()) {
    print summary.getKey()
}

It returns the correct objects, but each key includes the prefix, e.g. foldername/filename

I know I could just use Java's substring to strip the prefix, but I wanted to know whether there is a method for it in the AWS SDK.
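For reference, the substring workaround the asker mentions is only a couple of lines. A minimal sketch in plain Java (no SDK calls involved; the key values are made up for illustration):

```java
public class StripPrefix {
    // Strip everything up to and including the last '/' from an S3 key,
    // leaving only the final "filename" segment.
    static String fileName(String key) {
        int slash = key.lastIndexOf('/');
        return slash < 0 ? key : key.substring(slash + 1);
    }

    public static void main(String[] args) {
        System.out.println(fileName("foldername/filename")); // filename
        System.out.println(fileName("filename"));            // filename
    }
}
```

Using `lastIndexOf` rather than a fixed offset means the same helper works for keys nested more than one level deep.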

Sep 20, 2018 in AWS by bug_seeker
• 15,520 points


3 answers to this question.

+2 votes

For Scala developers, here is a recursive function that executes a full scan and map of the contents of an AmazonS3 bucket using the official AWS SDK for Java:

import com.amazonaws.services.s3.AmazonS3Client
import com.amazonaws.services.s3.model.{S3ObjectSummary, ObjectListing}
import scala.collection.JavaConversions.{collectionAsScalaIterable => asScala}

def map[T](s3: AmazonS3Client, bucket: String, prefix: String)(f: (S3ObjectSummary) => T): List[T] = {

  def scan(acc: List[T], listing: ObjectListing): List[T] = {
    val summaries = asScala[S3ObjectSummary](listing.getObjectSummaries())
    val mapped = (for (summary <- summaries) yield f(summary)).toList

    if (!listing.isTruncated) acc ::: mapped
    else scan(acc ::: mapped, s3.listNextBatchOfObjects(listing))
  }

  scan(List(), s3.listObjects(bucket, prefix))
}

To invoke the above curried map() function, pass the already constructed (and properly initialized) AmazonS3Client object (refer to the official AWS SDK for Java API Reference), the bucket name, and the prefix in the first parameter list. Pass the function f() you want applied to each object summary in the second parameter list.

For example

map(s3, bucket, prefix) { s => println(s.getKey.split("/")(1)) }

will print all the filenames (without the prefix),

val tuple = map(s3, bucket, prefix)(s => (s.getKey, s.getOwner, s.getSize))

will return the full list of (key, owner, size) tuples in that bucket/prefix, and

val totalSize = map(s3, "bucket", "prefix")(s => s.getSize).sum

will return the total size of its contents (note the additional sum() folding function applied at the end of the expression).

answered Sep 20, 2018 by Priyaj
• 58,090 points
+1 vote

There is not; the SDK's API reference lists all the methods that are available. The reason behind this is S3's design. S3 does not have "subfolders". Instead, it is simply a flat list of objects, where each key is the "prefix" plus the filename you chose. The console GUI displays the data as if it were stored in folders, the way Windows does, but there is no folder logic in S3 itself.
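To illustrate the point about the flat namespace: the "folders" the console shows can be derived entirely from the key list itself, which is roughly what the GUI does. A plain-Java sketch (the keys are made up; no SDK involved):

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class DerivedFolders {
    // Collect the top-level "folder" names implied by a flat list of S3 keys.
    // S3 stores no folder objects; the names exist only inside the keys.
    static Set<String> topLevelFolders(List<String> keys) {
        Set<String> folders = new LinkedHashSet<>();
        for (String key : keys) {
            int slash = key.indexOf('/');
            if (slash > 0) folders.add(key.substring(0, slash));
        }
        return folders;
    }

    public static void main(String[] args) {
        List<String> keys = List.of("photos/cat.png", "photos/dog.png",
                                    "docs/a.txt", "root.txt");
        System.out.println(topLevelFolders(keys)); // [photos, docs]
    }
}
```

This is also why deleting the last object under a prefix makes the "folder" disappear from the console: there was never a folder object to begin with.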

Your best bet is to split the key on "/" and take the last element of the resulting array.
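That split-and-take-last approach looks like this in plain Java (the key value is made up):

```java
public class LastSegment {
    // Take the final path segment of a key by splitting on '/'.
    static String lastSegment(String key) {
        String[] parts = key.split("/");
        return parts[parts.length - 1];
    }

    public static void main(String[] args) {
        System.out.println(lastSegment("foldername/filename")); // filename
    }
}
```

Note that Java's split() drops trailing empty strings, so a key ending in "/" (a folder placeholder object) returns the folder name rather than an empty string.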

Hope this helps!



answered May 21, 2019 by anonymous

edited Jul 10, 2023 by Khan Sarfaraz
But aren't files in S3 stored with the folder name followed by the filename? I was under that impression; please do correct me if I am wrong. Thank you.
0 votes


I think you can use the AWS CLI for this. The following example uses the list-objects command to display the names and sizes of all the objects in the specified bucket:

$ aws s3api list-objects --bucket text-content --query 'Contents[].{Key: Key, Size: Size}'
answered Dec 16, 2020 by MD
• 95,440 points
