What is the HDFS command to list all files in HDFS by timestamp?

0 votes
What is the command to list the directories in HDFS by timestamp? I tried hdfs dfs -ls, which provides the list of directories with their respective permissions. I tried a workaround with hdfs dfs -ls /tmp | sort -k6,7. Is there an inbuilt HDFS command for this?
Apr 10, 2018 in Big Data Hadoop by kurt_cobain
• 9,350 points
74,188 views

5 answers to this question.

0 votes

The following options are available with the hadoop fs -ls command:

Usage: hadoop fs -ls [-d] [-h] [-R] [-t] [-S] [-r] [-u] <args>

Options:

-d: Directories are listed as plain files.
-h: Format file sizes in a human-readable fashion (e.g. 64.0m instead of 67108864).
-R: Recursively list subdirectories encountered.
-t: Sort output by modification time (most recent first).
-S: Sort output by file size.
-r: Reverse the sort order.
-u: Use access time rather than modification time for display and sorting.

You can sort the files by modification time, newest first, using the following command:

hdfs dfs -ls -t -R /tmp
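
For example, to show the oldest entries first, or to sort by file size instead, the same flags can be combined as below (a sketch, assuming /tmp exists on your cluster):

hdfs dfs -ls -t -r -R /tmp
hdfs dfs -ls -S -R /tmp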
answered Apr 10, 2018 by Shubham
• 13,490 points
0 votes
hdfs dfs -ls -R -t should do the job (add -r to reverse the order).
answered Dec 7, 2018 by Shravan
0 votes

hdfs dfs -ls /test | sort -k6,7 is simple and easy. I don't see a reason why you would want to use anything else.
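
If you want the newest files at the top with this approach, the same pipe can be reversed; a minimal sketch, assuming the default hdfs dfs -ls output where column 6 is the date and column 7 the time:

hdfs dfs -ls /test | sort -r -k6,7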

answered Dec 7, 2018 by Pachit
0 votes
hdfs dfs -ls -t1
answered Dec 7, 2018 by Pratam
This does not sort the files by timestamp; -t1 is not a valid option for hdfs dfs -ls. The sort flag is -t and it takes no argument.
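For reference, the correct form sorts by modification time, newest first (assuming /tmp exists on your cluster):

hdfs dfs -ls -t /tmp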
0 votes

You can try filtering on a column value in the ls output.

ls -ltr | awk '$6 == "<Month of the year>" && $7 == <day of the month> {print $9}'
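
The same idea can be applied to HDFS paths against hdfs dfs -ls output; a rough sketch, assuming the default format where column 6 holds the date as yyyy-MM-dd and column 8 holds the path, with the date below purely a placeholder:

hdfs dfs -ls /tmp | awk '$6 == "2018-12-07" {print $8}'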
answered Dec 7, 2018 by Lenny
