Hadoop “Unable to load native-hadoop library for your platform” warning

+10 votes

I installed Hadoop on a server running CentOS. Whenever I run start-dfs.sh or stop-dfs.sh, I get the following warning:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

My Hadoop version is 2.2.0.

Can anyone tell me why I'm getting this warning? I am not sure what to do.

I've also added these two environment variables in hadoop-env.sh:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/"

export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native/"
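To check what actually exists at that location, the native directory can be listed (the path matches my install above):

ls /usr/local/hadoop/lib/native/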

Any suggestions?

Mar 21, 2018 in Big Data Hadoop by coldcode
• 1,980 points
6,870 views

10 answers to this question.

+2 votes

There may be two possibilities:

  1. You may have installed the wrong JDK 8 package. Make sure you download the 64-bit JDK 8 and remove your current 32-bit JDK 8.
  2. It's a warning because the bundled Hadoop native libraries were compiled for 32-bit, while you are probably running a 64-bit OS.

It's safe to ignore this warning.
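To tell which case applies (assuming a typical install under /usr/local/hadoop; adjust the path for your system), check the bitness of the JVM and of the native library; on Hadoop 2.4+ you can also ask Hadoop directly:

java -version                                         # look for "64-Bit Server VM"
file /usr/local/hadoop/lib/native/libhadoop.so.1.0.0  # reports a 32-bit or 64-bit ELF binary
hadoop checknative -a                                 # added in Hadoop 2.4, not available in 2.2.0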

answered Mar 22, 2018 by nitinrawat895
• 9,310 points
+2 votes

I had the same issue. I solved it by adding the following lines to .bashrc:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
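After adding these lines, reload the configuration and restart the daemons so the settings take effect:

source ~/.bashrc
stop-dfs.sh && start-dfs.sh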
answered Sep 10, 2018 by Neeraj
It's not working.
Hi! Did you try the solutions mentioned in other answers?
+2 votes

In my case, after I built Hadoop on my 64-bit Linux Mint OS, I replaced the native library in hadoop/lib. The problem still persisted. Then I figured out that Hadoop was pointing to hadoop/lib, not to hadoop/lib/native. So I just moved all the content from the native library directory to its parent, and the warning was gone.
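A sketch of that move, assuming $HADOOP_HOME points at the install directory:

mv $HADOOP_HOME/lib/native/* $HADOOP_HOME/lib/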

answered Sep 10, 2018 by Ram
+2 votes

This would also work:

export LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native
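Note that this only affects the current shell. To make it persistent (assuming the native libraries really are under /usr/lib/hadoop/lib/native on your machine), append it to ~/.bashrc:

echo 'export LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native:$LD_LIBRARY_PATH' >> ~/.bashrc
source ~/.bashrc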
answered Sep 10, 2018 by Gosh
+2 votes
Hello. I am using OSX and face the same problem. Any idea how to solve it?
answered Sep 10, 2018 by Sundar
Follow these steps, replacing the path and Hadoop version where appropriate:

wget http://www.eu.apache.org/dist/hadoop/common/hadoop-2.7.1/hadoop-2.7.1-src.tar.gz
tar xvf hadoop-2.7.1-src.tar.gz
cd hadoop-2.7.1-src
# Building the native code requires cmake and protobuf 2.5.0 to be installed first
mvn package -Pdist,native -DskipTests -Dtar
# The compiled native libraries land under hadoop-dist/target/hadoop-2.7.1/lib
mv hadoop-dist/target/hadoop-2.7.1/lib /usr/local/Cellar/hadoop/2.7.1/

Then update hadoop-env.sh with:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true -Djava.security.krb5.realm= -Djava.security.krb5.kdc= -Djava.library.path=/usr/local/Cellar/hadoop/2.7.1/lib/native"
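After restarting the daemons, the warning should be gone. To double-check that the freshly built library matches the JVM (paths assume the Homebrew layout above; on macOS the native library is a .dylib rather than a .so):

file /usr/local/Cellar/hadoop/2.7.1/lib/native/libhadoop.dylib
java -version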
0 votes

Check the glibc version. CentOS traditionally ships conservative, well-tested packages, which also means components such as glibc and protobuf tend to be old.

Compare the glibc version your system provides with the one the native library was linked against:

ldd --version
ldd /opt/hadoop/lib/native/libhadoop.so.1.0.0

If the system glibc is too old (for example, ldd reports 2.12 while the second command complains that version `GLIBC_2.14' was not found), update glibc.

If the current glibc version is fine, append "native" to the library path in your HADOOP_OPTS:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
answered Sep 10, 2018 by bug_seeker
• 14,980 points
0 votes

Make sure you have set the proper paths:

# JAVA env
#
export JAVA_HOME=/user/sys/jdk
export JRE_HOME=/user/sys/jdk/jre

export PATH=${JAVA_HOME}/bin:${JRE_HOME}/bin:${PATH}:.

# HADOOP env
#
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME

export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
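A quick sanity check after sourcing these (the /user/sys/jdk path is just this answer's layout; substitute your own install paths):

which java && java -version
ls $HADOOP_HOME/lib/native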
answered Dec 7, 2018 by Gopal
0 votes
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
answered Dec 7, 2018 by Sarthik
0 votes
Update your JDK and try again.
answered Dec 7, 2018 by Parth
0 votes

Add this path, then reload your shell configuration:

export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH
source ~/.bashrc
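You can then confirm the variable is visible in the new shell:

echo $JAVA_LIBRARY_PATH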
answered Dec 7, 2018 by Veeresh
