Spark Scala: How to list all folders in a directory

0 votes

I can do this with the following Hadoop command:

hadoop fs -ls hdfs://sandbox.hortonworks.com/demo/

I want to do the same with Scala or Spark. Can someone tell me how to do it?

Oct 31, 2018 in Big Data Hadoop by digger
• 26,550 points
2,129 views

3 answers to this question.

0 votes

This should work:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

val fs = FileSystem.get(new Configuration())
val status = fs.listStatus(new Path(YOUR_HDFS_PATH))  // YOUR_HDFS_PATH is your directory, e.g. "hdfs://sandbox.hortonworks.com/demo/"
status.foreach(x => println(x.getPath))
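Note that listStatus returns both files and folders under the path. If you only want the folders, you can filter on the status entries; a minimal sketch reusing the status array from above:

status.filter(_.isDirectory).foreach(x => println(x.getPath))  // keep directories only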
answered Oct 31, 2018 by Omkar
• 67,600 points
0 votes
import java.net.URI
import org.apache.hadoop.fs.{FileSystem, Path}

// sc is the SparkContext (available by default in spark-shell)
val listStatus = FileSystem.get(new URI(url), sc.hadoopConfiguration)
  .globStatus(new Path(url))

for (urlStatus <- listStatus) {
  println("urlStatus get Path: " + urlStatus.getPath())
}
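globStatus treats the path as a glob pattern, so passing the directory itself matches only that one entry. To list what is inside the directory, append a wildcard; a minimal sketch, assuming url points at the parent directory:

val children = FileSystem.get(new URI(url), sc.hadoopConfiguration)
  .globStatus(new Path(url + "/*"))
children.foreach(s => println(s.getPath))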
answered Dec 4, 2018 by Ramesh
0 votes
import org.apache.spark.sql.SparkSession
import org.apache.hadoop.fs.{FileSystem, Path}

val spark = SparkSession.builder().appName("Demo").getOrCreate()
val path = new Path("enter your directory path")
val fs: FileSystem = path.getFileSystem(spark.sparkContext.hadoopConfiguration)  // was projects.getFileSystem, but projects is undefined
val it = fs.listLocatedStatus(path)  // RemoteIterator[LocatedFileStatus]
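listLocatedStatus returns a Hadoop RemoteIterator rather than a Scala collection, so you iterate it by hand; a minimal sketch using the it value from above to print only the folders:

while (it.hasNext) {
  val status = it.next()
  if (status.isDirectory) println(status.getPath)  // directories only
}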
answered Dec 4, 2018 by Mark
