Datanode process not running in Hadoop

+2 votes

I have set up and configured a multi-node Hadoop cluster on my system.

Now when I start all the daemons by running the start-all.sh command, it shows all the processes initializing properly, as follows:

starting namenode, logging to /usr/local/hadoop/libexec/../logs/hadoop1-root-namenode-hadoop.out
hadoop1: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-datanode-hadoop1.out
hadoop2: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-datanode-hadoop2.out
hadoop1: starting secondarynamenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-secondarynamenode-hadoop1.out
starting jobtracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-jobtracker-hadoop1.out
hadoop1: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-tasktracker-hadoop1.out
hadoop2: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-tasktracker-hadoop2.out

However, when I type the jps command, I get the following output:

31057 NameNode
4001 RunJar
6182 RunJar
31328 SecondaryNameNode
31411 JobTracker
32119 Jps
31560 TaskTracker

As you can see, the DataNode process is missing from the list. Can someone tell me how to fix this?

Oct 22, 2018 in Big Data Hadoop by slayer

4 answers to this question.

+1 vote

You need to do something like this (see the sketch after the list):

  • bin/stop-all.sh (or stop-dfs.sh and stop-yarn.sh in the 2.x series)
  • rm -Rf /app/tmp/hadoop-your-username/* (clears the temporary/storage directory used by HDFS)
  • bin/hadoop namenode -format (note: this erases all existing HDFS metadata)
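Put together, the whole sequence might look roughly like this on a 1.x-style install. The /usr/local/hadoop location comes from the question and the /app/tmp/hadoop-<username> directory from the bullets above; substitute your own hadoop.tmp.dir if it differs:

cd /usr/local/hadoop
bin/stop-all.sh                      # stop every daemon (use sbin/stop-dfs.sh and sbin/stop-yarn.sh on 2.x)
rm -Rf /app/tmp/hadoop-$(whoami)/*   # wipe the old storage directory -- this destroys any data in HDFS
bin/hadoop namenode -format          # re-create the namespace so the DataNode can register cleanly
bin/start-all.sh                     # bring everything back up
jps                                  # DataNode should now be listed on each node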
answered Oct 22, 2018 by Omkar
+1 vote
Try starting each process manually.
answered Oct 25, 2018 by Satish
Can you mention how to do that?

Yes. These are the commands to start them manually:

./stop-all.sh
./hadoop-daemon.sh start namenode
./hadoop-daemon.sh start datanode
./yarn-daemon.sh start resourcemanager
./yarn-daemon.sh start nodemanager
./mr-jobhistory-daemon.sh start historyserver
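After starting the daemons, it is worth confirming that the DataNode actually stayed up. A small check, assuming the log directory from the question (the exact .log file name is a guess based on the .out paths shown there):

jps                                                                  # DataNode should appear in the list
tail -n 50 /usr/local/hadoop/logs/hadoop-root-datanode-hadoop1.log   # if it is missing, look here for the error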

Hope this helped.

This worked for me. Thanks!
+1 vote

First, stop all the processes by running:

stop-all.sh

Then go to the directory where you installed Hadoop, change into the bin folder, and run:

hadoop datanode

This should work.
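Running the DataNode this way keeps it in the foreground, so any exception that is killing it shows up directly on the console. A minimal sketch, assuming the install path from the question:

cd /usr/local/hadoop/bin
./hadoop datanode    # starts the DataNode in the foreground; read the console output for the failure reason
# Press Ctrl+C to stop it again, then restart it the normal way once the underlying problem is fixed.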

answered Oct 25, 2018 by Kiran
+1 vote

Run the following commands:

stop-all.sh
start-dfs.sh 
start-yarn.sh 
mr-jobhistory-daemon.sh start historyserver
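Once these finish, jps on each node should include a DataNode entry. A rough example of what a healthy master node might show on a 2.x cluster (the process IDs are illustrative):

jps
# 31057 NameNode
# 31201 DataNode
# 31328 SecondaryNameNode
# 31450 ResourceManager
# 31560 NodeManager
# 31610 JobHistoryServer
# 31700 Jps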
answered Oct 25, 2018 by Anand
