Hadoop is not able to find jps command

0 votes

I have started all the Hadoop daemons and am now using the jps command to check whether they are all running, but I am getting the error below:

  hduser@naveen:/usr/local/hadoop$ jps
  The program 'jps' can be found in the following packages:
   * openjdk-6-jdk
   * openjdk-7-jdk
  Try: sudo apt-get install <selected package>

My java version is as follows:

hduser@naveen:/usr/local/hadoop$ java -version
java version "1.6.0_33"
Java(TM) SE Runtime Environment (build 1.6.0_33-b04)
Java HotSpot(TM) 64-Bit Server VM (build 20.8-b03, mixed mode)


JAVA_HOME in conf/hadoop-env.sh is as follows:

# set JAVA_HOME in this file, so that it is correctly defined on
export JAVA_HOME=/usr/lib/jvm/jdk1.6.0_33/

Can anyone help me out?

Apr 18, 2018 in Big Data Hadoop by nitinrawat895
• 9,310 points
1,267 views

2 answers to this question.

+1 vote

jps is not a Hadoop command. It is a JDK tool (the Java Virtual Machine Process Status Tool) that lists the Java processes running on the machine.

The jps binary is found in the $JAVA_HOME/bin directory. I would recommend setting an alias for the jps command:

alias jps='/usr/lib/jvm/jdk1.6.0_33/bin/jps'
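An alias defined on the command line lasts only for the current session. A minimal sketch of making it permanent, assuming the JDK path from the question (both JAVA_HOME and the startup file are assumptions to adjust for your setup):

```shell
# Sketch: persist the jps alias in the shell startup file.
# JAVA_HOME below is the path from the question's hadoop-env.sh;
# RC_FILE defaults to ~/.bashrc -- adjust both to your install.
JAVA_HOME=/usr/lib/jvm/jdk1.6.0_33
RC_FILE=${RC_FILE:-$HOME/.bashrc}

echo "alias jps='$JAVA_HOME/bin/jps'" >> "$RC_FILE"

# Reload the file (or open a new terminal) so the alias takes effect:
# . "$RC_FILE"
```

After reloading, `jps` should work from any directory, not just $JAVA_HOME/bin.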

I hope this will help you.

answered Apr 18, 2018 by Shubham
• 12,710 points

Run command

sudo yum install ant

Could you please clarify why the ant package is required for the jps command? If I am not wrong, the openjdk and java packages have to be configured in the Hadoop environment. Are you referring to the ant package for jps?

0 votes

Hi, try this method:

First, update the JDK or Java using ant with the command below:

$ sudo yum install ant

Next

# set JAVA_HOME in this file, so that it is correctly defined on
export JAVA_HOME=/usr/lib/jvm/jdk(version you found)/

Next 

# set alias for the path using 
alias jps='/usr/lib/jvm/jdk(version you found)/bin/jps'

Next 

# Type in command again
sudo jps

It will list all the running Hadoop daemons. I hope this helps you solve the error.
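Once jps works, you can check programmatically that the expected daemons are up. A hedged sketch, assuming a classic Hadoop 1.x single-node setup (the daemon names below, including JobTracker and TaskTracker, are assumptions matching the JDK 6 era of this question; Hadoop 2.x shows ResourceManager and NodeManager instead):

```shell
# Hedged sketch: report which of the usual Hadoop 1.x single-node daemons
# appear in jps-style output. Daemon names are assumptions for a pre-YARN
# setup; adjust the list for your Hadoop version.
check_daemons() {
    # $1: output of jps (one "pid ClassName" per line)
    for d in NameNode DataNode SecondaryNameNode JobTracker TaskTracker; do
        # -w matches whole words, so "SecondaryNameNode" does not
        # falsely count as "NameNode"
        if printf '%s\n' "$1" | grep -qw "$d"; then
            echo "$d: running"
        else
            echo "$d: NOT running"
        fi
    done
}

# On a live cluster, pass it real jps output:
#   check_daemons "$(jps)"
```

Any daemon reported as NOT running points you at the log file to check next.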

answered Mar 21 by Pruthvi

edited Mar 21 by Vardhan
Hello @Pruthvi. I have installed ant, but that didn't update Java or the JDK. Can you please tell me which command to use to update it?
