Spark Scala: How to list all folders in directory

0 votes

I can do this using the following command in Hadoop:

hadoop fs -ls hdfs://sandbox.hortonworks.com/demo/

I want to do the same with Scala or Spark. Can someone tell me how to do it?

Oct 31, 2018 in Big Data Hadoop by digger

3 answers to this question.

0 votes

This should work:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// List everything directly under the given path
val fs = FileSystem.get(new Configuration())
val status = fs.listStatus(new Path(YOUR_HDFS_PATH))
status.foreach(x => println(x.getPath))
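
Note that listStatus returns files as well as folders. Since the question asks for folders specifically, you can filter on the status. A minimal sketch (FileStatus.isDirectory is available in Hadoop 2.x and later; older releases call it isDir):

// Keep only the directory entries
status.filter(_.isDirectory).foreach(x => println(x.getPath))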
answered Oct 31, 2018 by Omkar
0 votes
import java.net.URI

// url is the HDFS directory to list, e.g. "hdfs://sandbox.hortonworks.com/demo/"
val listStatus = org.apache.hadoop.fs.FileSystem
  .get(new URI(url), sc.hadoopConfiguration)
  .globStatus(new org.apache.hadoop.fs.Path(url))

for (urlStatus <- listStatus) {
  println("urlStatus get Path: " + urlStatus.getPath)
}
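
Because globStatus accepts glob patterns, you can match just the entries one level below a folder and then keep the directories. A hypothetical example using the path from the question:

val base = "hdfs://sandbox.hortonworks.com/demo"
val dirs = org.apache.hadoop.fs.FileSystem
  .get(new URI(base), sc.hadoopConfiguration)
  .globStatus(new org.apache.hadoop.fs.Path(base + "/*"))
  .filter(_.isDirectory)
dirs.foreach(d => println(d.getPath))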
answered Dec 4, 2018 by Ramesh
0 votes
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("Demo").getOrCreate()
val path = new Path("enter your directory path")
// Resolve the FileSystem that owns this path from Spark's Hadoop configuration
val fs: FileSystem = path.getFileSystem(spark.sparkContext.hadoopConfiguration)
val it = fs.listLocatedStatus(path)
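
listLocatedStatus returns a Hadoop RemoteIterator, not a Scala collection, so you have to drain it explicitly. A minimal sketch that prints only the directories:

while (it.hasNext) {
  val status = it.next()
  if (status.isDirectory) println(status.getPath)
}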
answered Dec 4, 2018 by Mark
