Format HDFS Namenode Error: Could not find or load main class ”-Djava.library.path=.home.hadoop.hadoop-3.2.1.lib.native”

0 votes

I am building a single-node HDFS setup on Ubuntu 18.04 and am getting the following error when I try to format the HDFS NameNode using the command:

hdfs namenode -format

Error: Could not find or load main class ”-Djava.library.path=.home.hadoop.hadoop-3.2.1.lib.native”

I have the following configuration files:

.bashrc

export HADOOP_HOME=/home/hadoop/hadoop-3.2.1
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_OPTS=”-Djava.library.path=$HADOOP_HOME/lib/native”

hadoop-env.sh

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

core-site.xml

<configuration>
<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/hadoop/tmpdata</value>
</property>
<property>
  <name>fs.default.name</name>
  <value>hdfs://127.0.0.1:9000</value>
</property>
</configuration>

hdfs-site.xml

<configuration>
<property>
  <name>dfs.namenode.name.dir</name>
  <value>/home/hadoop/dfsdata/namenode</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/home/hadoop/dfsdata/datanode</value>
</property>
<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
</configuration>

mapred-site.xml

<configuration> 
<property> 
  <name>mapreduce.framework.name</name> 
  <value>yarn</value> 
</property> 
</configuration>

yarn-site.xml

<configuration>
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
  <value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
<property>
  <name>yarn.resourcemanager.hostname</name>
  <value>127.0.0.1</value>
</property>
<property>
  <name>yarn.acl.enable</name>
  <value>0</value>
</property>
<property>
  <name>yarn.nodemanager.env-whitelist</name>
  <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
</property>
</configuration>

I have created the namenode and datanode folders referenced in hdfs-site.xml. I have been chasing this for several hours without a solution. Thanks for any suggested fixes.

Jun 12 in Big Data Hadoop by fwood

1 answer to this question.

0 votes

Hi @fwood,

Looking at your configuration, you haven't set JAVA_HOME in your .bashrc (it is only set in hadoop-env.sh). Set it there as well. You can use the command below to find your Java installation:

$ which java
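
On Ubuntu, `which java` usually returns a symlink, so resolving it shows the actual install directory to use for JAVA_HOME (the path below is an example from a typical OpenJDK 8 install; yours may differ):

$ readlink -f $(which java)
# e.g. /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java
# so JAVA_HOME would be /usr/lib/jvm/java-8-openjdk-amd64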

Also, change the value of HADOOP_OPTS as given below, and make sure the quotes are plain ASCII double quotes: the curly quotes (” ”) in your current .bashrc are passed through to the JVM as part of the argument, which is exactly why Java reports ”-Djava.library.path=…” as a main class it cannot load.

export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/"

In your hadoop-env.sh file, also export HADOOP_CONF_DIR. And I suggest you keep the standard variable names rather than substituting your own (for example, relying on HADOOP_INSTALL in place of HADOOP_CONF_DIR): Hadoop looks these variables up by name internally, and if it finds unexpected names or values it may give an error.
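
For example, assuming the standard layout of the Hadoop tarball under $HADOOP_HOME, that export would be:

export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop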

answered Jun 12 by MD
Thank you, MD: that did the trick. I have Hadoop up and running with YARN. For those looking at this thread later, here are my updated files:

.bashrc

export HADOOP_HOME=/home/hadoop/hadoop-3.2.1
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/"
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

hadoop-env.sh

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
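
For anyone retracing this later, the verification sequence looks roughly like the following (after a successful start, jps should list NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager, plus Jps itself):

$ source ~/.bashrc      # pick up the new environment variables
$ hdfs namenode -format # format the NameNode (first run only)
$ start-dfs.sh          # start HDFS daemons
$ start-yarn.sh         # start YARN daemons
$ jps                   # confirm the daemons are running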

Now that I have Hadoop up and running, I plan to install ZooKeeper and Accumulo. The overall project goal is to build a basic Docker container app and get it to talk to data in the Hadoop store through the Accumulo security structure. Thanks again for the help.
