Access DFS through Java API

0 votes

I'm trying to access HDFS through the Java API, so I created a Configuration object using 

Configuration conf = new Configuration(); 

However, it is not working and the client is not able to find HDFS. I've tried adding core-site.xml, mapred-site.xml, yarn-site.xml, and hdfs-site.xml to the Configuration as resources, but it doesn't change anything. What should I do now? 

Thanks in advance.

Mar 23, 2018 in Big Data Hadoop by Ashish
• 2,650 points
1,858 views

1 answer to this question.

0 votes

This is probably because you are not adding core-site.xml and hdfs-site.xml to the Configuration correctly. 

There is also no need to add mapred-site.xml or yarn-site.xml; only the HDFS client configuration matters here. 

Try the code below: 

Configuration config = new Configuration(); 
config.addResource(new Path("file:///etc/hadoop/conf/core-site.xml")); // put your actual core-site.xml path here 
config.addResource(new Path("file:///etc/hadoop/conf/hdfs-site.xml")); // put your actual hdfs-site.xml path here 

Path p = new Path("."); // an HDFS path; "." resolves relative to your HDFS home directory 
FileSystem fs1 = p.getFileSystem(config); 
System.out.println("Home directory: " + fs1.getHomeDirectory()); 

Alternatively, you can also try this: 

Instead of adding the configuration files with the addResource method, use the set method. Open your core-site.xml file, find the value of fs.defaultFS, and set it directly: 

config.set("fs.defaultFS", "hdfs://<Namenode-Host>:<Port>"); // refer to your core-site.xml file and replace <Namenode-Host> and <Port> with your cluster's NameNode host and port (the default port number is `8020`) 
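Putting the pieces above together, here is a minimal, self-contained sketch of the set-method approach; the host namenode.example.com and port 8020 are placeholder assumptions that you must replace with the values from your own core-site.xml:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsHomeDir {
    public static void main(String[] args) throws IOException {
        Configuration config = new Configuration();
        // Placeholder: replace with your cluster's actual NameNode host and port
        // (copy the fs.defaultFS value from your core-site.xml).
        config.set("fs.defaultFS", "hdfs://namenode.example.com:8020");

        // FileSystem.get resolves the hdfs:// scheme in fs.defaultFS
        // to the DistributedFileSystem client.
        FileSystem fs = FileSystem.get(config);
        System.out.println("Home directory: " + fs.getHomeDirectory());

        // List the home directory contents to verify connectivity.
        for (FileStatus status : fs.listStatus(new Path("."))) {
            System.out.println(status.getPath());
        }
    }
}
```

If FileSystem.get still returns a LocalFileSystem instead of HDFS, it usually means fs.defaultFS was not picked up, so double-check the property name and the URI.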

answered Mar 23, 2018 by nitinrawat895
• 11,380 points

Related Questions In Big Data Hadoop

0 votes
1 answer

Moving files in Hadoop using the Java API?

I would recommend you to use FileSystem.rename(). ...READ MORE

answered Apr 15, 2018 in Big Data Hadoop by Shubham
• 13,490 points
2,638 views
0 votes
1 answer

Hadoop HDFS: Java API to move files to hdfs

You can use the FileUtil api to do this. Example: Configuration ...READ MORE

answered Nov 19, 2018 in Big Data Hadoop by Omkar
• 69,230 points
4,059 views
–1 vote
1 answer

How to access Hadoop counter values using API?

You can use the job object to access the ...READ MORE

answered Dec 31, 2018 in Big Data Hadoop by Omkar
• 69,230 points
1,121 views
–1 vote
1 answer

Hadoop dfs -ls command?

In your case there is no difference ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by kurt_cobain
• 9,390 points
4,550 views
+1 vote
1 answer

Hadoop Mapreduce word count Program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
10,930 views
+2 votes
11 answers

hadoop fs -put command?

Hi, You can create one directory in HDFS ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
108,339 views
0 votes
1 answer

Hadoop security GroupMappingServiceProvider exception for Spark job via Dataproc API

One of the reason behin you getting ...READ MORE

answered Mar 23, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
889 views
0 votes
1 answer

Is there any way to increase Java Heap size in Hadoop?

You can add some more memory by ...READ MORE

answered Apr 12, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
4,870 views