Access DFS through Java API

0 votes

I'm trying to access DFS through the Java API, so I created a Configuration object using:

Configuration conf = new Configuration();

It is not working and cannot find DFS. I've tried adding core-site.xml, mapred-site.xml, yarn-site.xml, and hdfs-site.xml to the Configuration as resources, but it doesn't change anything. What should I do now?

Thanks in advance.

Mar 22, 2018 in Big Data Hadoop by Ashish
• 2,630 points
246 views

1 answer to this question.

0 votes

This is probably because you are not adding core-site.xml and hdfs-site.xml to the configuration correctly.

There is no need to add mapred-site.xml or yarn-site.xml.

Try the code below; it should work:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

Configuration config = new Configuration();
config.addResource(new Path("file:///etc/hadoop/conf/core-site.xml")); // put your actual core-site.xml path here
config.addResource(new Path("file:///etc/hadoop/conf/hdfs-site.xml")); // put your actual hdfs-site.xml path here

Path p = new Path("."); // put an HDFS path here
FileSystem fs1 = p.getFileSystem(config); // resolves the file system from the loaded configuration
System.out.println("Home directory: " + fs1.getHomeDirectory());

Alternatively, you can try the following:

Instead of adding the configuration files with the addResource method, use the set method. Open your core-site.xml file and find the value of fs.defaultFS, then set it directly on the Configuration:

config.set("fs.defaultFS", "hdfs://<Namenode-Host>:<Port>"); // refer to your core-site.xml and replace <Namenode-Host> and <Port> with your cluster's NameNode host and port (the default port is 8020)
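For example, here is a minimal end-to-end sketch of this approach; the class name, host name, and port are placeholders, so adjust them to your cluster:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class HdfsConnectTest {
    public static void main(String[] args) throws IOException {
        Configuration config = new Configuration();
        // Placeholder URI: copy the fs.defaultFS value from your own core-site.xml
        config.set("fs.defaultFS", "hdfs://namenode.example.com:8020");

        // FileSystem.get() returns a client for the file system named in fs.defaultFS
        FileSystem fs = FileSystem.get(config);
        System.out.println("Home directory: " + fs.getHomeDirectory());
        System.out.println("Working directory: " + fs.getWorkingDirectory());
    }
}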

answered Mar 22, 2018 by nitinrawat895
• 10,510 points
