NodeManager is not able to connect to the ResourceManager


I have an up-and-running Hadoop cluster with 4 nodes. But whenever I execute a MapReduce program, I see only 1 active node — the one from which I am submitting the job. When I check the NodeManager's log, I see this:

yarn-deploy-nodemanager-master.log:
INFO org.apache.hadoop.yarn.client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032

Where can I find the property that assigns the ResourceManager the address 0.0.0.0:8032? Can anyone help me understand this error?

May 1, 2018 in Big Data Hadoop by coldcode

1 answer to this question.

Here, I guess the issue is with the firewall. The ResourceManager port that the NodeManagers are trying to connect to is blocked. You could turn off the firewall entirely, but I would recommend only opening port 8032 on the ResourceManager host instead.
answered May 1, 2018 by Shubham
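A note on the original question about where the 0.0.0.0:8032 address comes from: in YARN, `yarn.resourcemanager.address` defaults to `${yarn.resourcemanager.hostname}:8032`, and `yarn.resourcemanager.hostname` itself defaults to `0.0.0.0`. If you never set it, NodeManagers on other hosts will try to reach the ResourceManager at an unroutable address. A minimal sketch of the fix, assuming your ResourceManager runs on a host named `master` (a placeholder — substitute your actual hostname), placed in `yarn-site.xml` on every node:

```xml
<!-- yarn-site.xml: point all daemons at the real ResourceManager host.
     "master" is a hypothetical hostname; use your RM's actual hostname. -->
<property>
  <name>yarn.resourcemanager.hostname</name>
  <value>master</value>
</property>
```

After editing, restart the YARN daemons. If the firewall is the blocker, on a firewalld-based system you can open only port 8032 with `sudo firewall-cmd --permanent --add-port=8032/tcp` followed by `sudo firewall-cmd --reload`, and then verify reachability from a NodeManager host with `telnet master 8032`.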
