Help me solve this error: Unable to load native-hadoop library for your platform

0 votes

I'm currently configuring Hadoop on a server running CentOS. When I run start-dfs.sh or stop-dfs.sh, I get the following warning:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I'm running Hadoop 2.2.0.

However, the contents of the /native/ directory on Hadoop 2.x appear to be different, so I am not sure what to do.

I've also added these two environment variables in hadoop-env.sh:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/"
export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native/"
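
For reference, the variant of these settings that most guides recommend points java.library.path at the native subdirectory rather than at lib/ itself. Assuming Hadoop is installed under /usr/local/hadoop, that would be:

export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native"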

Can anyone help me out?

Jun 20, 2019 in Big Data Hadoop by nitinrawat895
• 11,380 points
3,346 views

1 answer to this question.

0 votes

I hope I've understood your query properly.

I just installed Hadoop 2.6 from a tarball on 64-bit CentOS 6.6. The Hadoop install did indeed come with a prebuilt 64-bit native library. For my install, it is here:

/opt/hadoop/lib/native/libhadoop.so.1.0.0

And I know it is 64-bit:

[hadoop@VMWHADTEST01 native]$ ldd libhadoop.so.1.0.0
./libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)
linux-vdso.so.1 =>  (0x00007fff43510000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007f9be553a000)
libc.so.6 => /lib64/libc.so.6 (0x00007f9be51a5000)
/lib64/ld-linux-x86-64.so.2 (0x00007f9be5966000)
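
As an aside, if all you want to know is whether a shared library is 32-bit or 64-bit, the file command answers that more directly than ldd; for a 64-bit build it should print something along these lines:

file libhadoop.so.1.0.0
libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, ...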

Unfortunately, I stupidly overlooked the answer staring me right in the face, because I was focused on the question "Is this library 32-bit or 64-bit?":

`GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)

So, lesson learned. Anyway, the rest at least led me to being able to suppress the warning. I continued and did everything recommended in the other answers, providing the library path via the HADOOP_OPTS environment variable, to no avail. So I looked at the source code. The module that generates the warning gives you the hint (util.NativeCodeLoader):

15/06/18 18:59:23 WARN util.NativeCodeLoader: Unable to load native-hadoop    library for your platform... using builtin-java classes where applicable

Ah, there is some debug-level logging - let's turn that on and see if we get some additional help. This is done by adding the following line to the $HADOOP_CONF_DIR/log4j.properties file:

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG
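
If you prefer to do it from the shell, appending the line is a one-liner (assuming HADOOP_CONF_DIR is set in your environment, e.g. to /opt/hadoop/etc/hadoop):

echo 'log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG' >> $HADOOP_CONF_DIR/log4j.properties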

Then I ran a command that generates the original warning, like stop-dfs.sh, and got this goodie:

15/06/18 19:05:19 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /opt/hadoop/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)

And the answer is revealed in this snippet of the debug message (the same thing that the previous ldd command 'tried' to tell me):

`GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)

What version of GLIBC do I have? Here's a simple trick to find out:

[hadoop@VMWHADTEST01 hadoop]$ ldd --version
ldd (GNU libc) 2.12
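
A couple of other ways to cross-check the same thing, in case the ldd output is ambiguous on your system:

getconf GNU_LIBC_VERSION   # prints something like: glibc 2.12
rpm -q glibc               # on CentOS/RHEL, shows the installed glibc package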

So I have GLIBC 2.12 and can't update my OS to 2.14. The only options are to build the native libraries from source on my OS, or to suppress the warning and ignore it for now. I opted to suppress the annoying warning for now (but do plan to build from source in the future), using the same logging option we used to get the debug message, except now at ERROR level:

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
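
And for anyone who does want to go the build-from-source route right away: the usual recipe (roughly what Hadoop's BUILDING.txt describes) is to install the native toolchain and protobuf 2.5, run the Maven native profile, and copy the resulting native directory over the bundled one. The paths below are illustrative, not exact:

# on the build machine: gcc, make, cmake, zlib-devel, openssl-devel, protobuf 2.5 required
mvn package -Pdist,native -DskipTests -Dtar
# then copy hadoop-dist/target/hadoop-<version>/lib/native/* into $HADOOP_HOME/lib/native/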

I hope this helps others see that a big benefit of open source software is that you can figure this stuff out if you take some simple logical steps.

answered Jun 20, 2019 by ravikiran
• 4,620 points
