Hadoop Pipes: shared libraries "No such file or directory" error

I am trying to build and run a Hadoop Pipes program with the following Makefile:

CC = g++
HADOOP_PATH = /usr/lib/HADOOP
OTHERLIB1_PATH = /usr/lib/OTHERLIB1
OTHERLIB2_PATH = /usr/lib/OTHERLIB2
OTHERLIB3_PATH = /usr/lib/OTHERLIB3
OTHERLIB4_PATH = /usr/lib/OTHERLIB4
IMAGE_PATH = /usr/lib/IMAGE
LIB_PATH = ../../../src/Lib
PLATFORM = Linux-amd64-64

CFLAGS_HDP =  -O3 \
        -I$(LIB_PATH) \
        -I$(OTHERLIB1_PATH)/include \
        -I$(HADOOP_PATH)/$(PLATFORM)/include \
        -I$(OTHERLIB4_PATH)/include \
        -I$(OTHERLIB2_PATH)/include \
        -I$(OTHERLIB3_PATH)/include 

LDFLAGS_HDP =   -L$(OTHERLIB1_PATH)/lib \
        -L$(HADOOP_PATH)/$(PLATFORM)/lib \
        -L$(OTHERLIB3_PATH)/lib \
        -L$(OTHERLIB2_PATH)/lib \
        -L$(OTHERLIB4_PATH)/lib \
        -L$(LIB_PATH)/.libs \
        -lhadooppipes -lhadooputils -lpthread -lcrypto \
        -lLib -lLib4 -lLib1

all: pipes clean 

clean:
        rm  *.o

pipes: LibPipes.cpp xml DocToXml
        $(CC) $(CFLAGS_HDP) \
        LibPipes.cpp \
        -o Lib_Pipes base64.o \
        xml.o DocToXml.o $(LDFLAGS_HDP)


xml: xml.cpp base64
        $(CC) $(CFLAGS_HDP) -c xml.cpp -o xml.o

base64: base64.cpp
        $(CC) $(CFLAGS_HDP) -c base64.cpp -o base64.o

DocToXml: DocToXml.cpp
        $(CC) $(CFLAGS_HDP) -c DocToXml.cpp -o  DocToXml.o

I run the resulting program as follows:

hadoop pipes \
-D hadoop.pipes.java.recordreader=false \
-D hadoop.pipes.java.recordwriter=false \
-D mapred.map.tasks=128 \
-inputformat org.apache.hadoop.mapred.SequenceFileInputFormat \
-writer org.apache.hadoop.mapred.SequenceFileOutputFormat \
-reduce org.apache.hadoop.mapred.lib.IdentityReducer \
-input Input \
-output Output \
-program /user/uss/bin/Lib_Pipes \
-reduces 1

I get the following error:

error while loading shared libraries: Lib.so.0: cannot open shared object file: No such file or directory
Aug 30, 2018 in Big Data Hadoop by Neha

1 answer to this question.


Add -Wl,-rpath,my.zip/lib to LDFLAGS_HDP (the -Wl, prefix is needed to hand -rpath to the linker when you link through g++). This embeds a run-time search path that points into the archive you are about to ship with the job. You will also need to create the directory so the link step and staging work:
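A sketch of the amended link flags, assuming the Makefile from the question (only the -Wl,-rpath line is new):

```makefile
# Same LDFLAGS_HDP as before, plus a run-time search path into the
# job archive; -Wl, forwards the option to the linker through g++.
LDFLAGS_HDP =   -L$(OTHERLIB1_PATH)/lib \
        -L$(HADOOP_PATH)/$(PLATFORM)/lib \
        -L$(OTHERLIB3_PATH)/lib \
        -L$(OTHERLIB2_PATH)/lib \
        -L$(OTHERLIB4_PATH)/lib \
        -L$(LIB_PATH)/.libs \
        -Wl,-rpath,my.zip/lib \
        -lhadooppipes -lhadooputils -lpthread -lcrypto \
        -lLib -lLib4 -lLib1
```

The rpath is deliberately relative: Hadoop unpacks the archive in each task's working directory, so my.zip/lib resolves on the compute node.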

# in src dir
mkdir -p my.zip/lib
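The library named in the error then has to be staged into that directory before the archive is built. A sketch, assuming the shared object (libLib.so.0 is a guess based on the error message) is produced under $(LIB_PATH)/.libs; adjust the paths to your build:

```shell
# Create the lib directory inside the future archive.
mkdir -p my.zip/lib
# Copy the shared object the loader could not find; the glob and the
# source path are illustrative, substitute your actual build output.
cp ../../../src/Lib/.libs/libLib.so* my.zip/lib/ 2>/dev/null || true
# Inspect what was staged.
ls my.zip/lib
```

Once the directory is populated, zip it up (e.g. zip -r my.zip.zip my.zip, or however your packaging step names the archive) so it can be passed to the job.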

Then run pipes with your existing command, adding:

-archives my.zip

Hadoop distributes the archive to every task node and unpacks it in the task's working directory, so the relative rpath my.zip/lib resolves exactly where your binary runs.
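Put together, the run command from the question becomes the following (only the -archives line is new; all other options are unchanged):

```shell
hadoop pipes \
-D hadoop.pipes.java.recordreader=false \
-D hadoop.pipes.java.recordwriter=false \
-D mapred.map.tasks=128 \
-inputformat org.apache.hadoop.mapred.SequenceFileInputFormat \
-writer org.apache.hadoop.mapred.SequenceFileOutputFormat \
-reduce org.apache.hadoop.mapred.lib.IdentityReducer \
-input Input \
-output Output \
-archives my.zip \
-program /user/uss/bin/Lib_Pipes \
-reduces 1
```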
answered Aug 30, 2018 by Frankie
