NodeManager is not able to connect to the ResourceManager.

0 votes

I have an up-and-running Hadoop cluster with 4 nodes, but whenever I execute a MapReduce program I see only 1 active node, the one from which I submit the job. When I check the NodeManager log, I see this:

yarn-deploy-nodemanager-master.log:
INFO org.apache.hadoop.yarn.client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032

Where can I find the property that assigns the ResourceManager the address 0.0.0.0:8032? Can anyone help me understand this error?

May 1, 2018 in Big Data Hadoop by coldcode
• 1,980 points

1 answer to this question.

0 votes
I guess the issue here is with the firewall: the ResourceManager port that the NodeManagers are trying to connect to is blocked. You could turn off the firewall entirely, but I would recommend only opening port 8032 on the ResourceManager node instead.
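
As a rough sketch of that approach (assuming a firewalld-based distribution and that HADOOP_CONF_DIR points at your Hadoop configuration directory; adjust for your setup):

# Check where the 0.0.0.0:8032 address comes from: yarn.resourcemanager.address
# defaults to ${yarn.resourcemanager.hostname}:8032, and yarn.resourcemanager.hostname
# defaults to 0.0.0.0 unless it is overridden in yarn-site.xml on every node.
grep -A1 "yarn.resourcemanager" "$HADOOP_CONF_DIR/yarn-site.xml"

# Instead of disabling the firewall, open only the ResourceManager port (8032)
# on the master node (firewalld example; use the iptables/ufw equivalent elsewhere).
sudo firewall-cmd --permanent --add-port=8032/tcp
sudo firewall-cmd --reload

Opening only port 8032 keeps the rest of the master node protected while still letting the NodeManagers reach the ResourceManager.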
answered May 1, 2018 by Shubham
• 12,110 points
