Not able to start Hadoop DFS

+2 votes

I am trying to install a Hadoop 2.2.0 cluster on my servers, all of which are 64-bit. When I run ./start-dfs.sh, I get the following error:

13/11/15 14:29:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hchen/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.namenode]
sed: -e expression #1, char 6: unknown option to `s' have: ssh: Could not resolve hostname have: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
-c: Unknown cipher type 'cd'
Java: ssh: Could not resolve hostname Java: Name or service not known
The authenticity of host 'namenode (192.168.1.62)' can't be established.
RSA key fingerprint is 65:f9:aa:7c:8f:fc:74:e4:c7:a2:f5:7f:d2:cd:55:d4.
Are you sure you want to continue connecting (yes/no)? VM: ssh: Could not resolve hostname VM: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
...
Oct 22, 2018 in Big Data Hadoop by slayer

recategorized Oct 25, 2018 by Omkar
I am installing Hadoop. I downloaded the JDK and Hadoop and set the paths in my .bashrc file. When I execute source .bashrc to load the variables, I get this error:

bash: export: `=': not a valid identifier
bash: export: `/home/user/jdk1.8.0_181/bin:/home/user/jdk1.8.0_181/bin:/home/user/jdk1.8.0_181/bin:/home/user/jdk1.8.0_181/bin:/home/user/jdk1.8.0_181/bin:/home/user/jdk1.8.0_181/bin:/home/user/jdk1.8.0_181/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/bin:/sbin:/home/user/hadoop-2.8.5/bin:/usr/lib/sqoop/bin:/home/user/.local/bin:/home/user/bin:/home/user/hadoop-2.8.5/bin:/usr/lib/sqoop/bin:/home/user/hadoop-2.8.5/bin:/home/user/hadoop-2.8.5/bin:/home/user/hadoop-2.8.5/bin:/usr/lib/sqoop/bin:/home/user/hadoop-2.8.5/bin:/usr/lib/sqoop/bin:/home/user/hadoop-2.8.5/bin:/usr/lib/sqoop/bin:/home/user/hadoop-2.8.5/bin:/usr/lib/sqoop/bin:/home/user/hadoop-2.8.5/pig/bin:/home/user/hadoop-2.8.5/bin': not a valid identifier

After this error, whenever I open a terminal I get the same error. I rebooted the system, but the error persists.

Hey there! Could you please share the entries you made in your .bashrc file?
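
In the meantime, one common cause worth checking: bash: export: `=': not a valid identifier almost always means there are spaces around the = in an export line, because bash then parses the name, the =, and the value as three separate words. A minimal illustration (the Hadoop path here is just an example):

# Wrong: bash treats "PATH", "=", and the value as separate arguments
export PATH = $PATH:/home/user/hadoop-2.8.5/bin

# Right: no spaces around "="
export PATH=$PATH:/home/user/hadoop-2.8.5/bin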

5 answers to this question.

0 votes

Add the following to your .bashrc, replacing $HADOOP_HOME with the path to your Hadoop folder:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
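
After adding these lines, reload your shell configuration and check that the native library now loads. A quick sanity check, assuming the hadoop command is on your PATH (checknative ships with Hadoop 2.x):

# Reload the updated environment in the current shell
source ~/.bashrc

# Lists the native libraries Hadoop can load; "hadoop: true"
# indicates libhadoop.so was found
hadoop checknative -a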

Then run the following commands:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
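
start-dfs.sh uses ssh to reach every node listed in the slaves file, including localhost, so you can verify the key-based login before retrying:

# Should log in without a password prompt
# (accept the host key fingerprint once if asked)
ssh localhost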

This should solve your problem.

answered Oct 22, 2018 by Omkar
0 votes
This should be helpful:

https://gist.github.com/ruo91/7154697
answered Oct 25, 2018 by anonymous
0 votes

Export the variables in hadoop-env.sh like this:

vim /usr/local/hadoop/etc/hadoop/hadoop-env.sh

hadoop-env.sh can be found in $HADOOP_HOME/etc/hadoop/

# Hadoop variables
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64  # your JDK install path
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
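
Once hadoop-env.sh is saved, the start scripts pick these variables up automatically. A quick way to apply and verify the change:

# Restart HDFS so the new environment takes effect
stop-dfs.sh
start-dfs.sh

# NameNode, DataNode and SecondaryNameNode should appear
# among the running Java processes
jps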
answered Oct 25, 2018 by Manjunath
0 votes

To solve the SSH error, you can pass the port and username explicitly:

ssh -p 22 myName@hostname

or

ssh -l myName -p 22 hostname
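
If you connect to this host often, the same options can go in ~/.ssh/config so that a plain ssh namenode works. A sketch reusing the hostname and user from above (adjust both to your setup):

# ~/.ssh/config
Host namenode
    HostName 192.168.1.62
    User myName
    Port 22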
answered Oct 25, 2018 by Shaam
0 votes

You can re-install openssh-client and openssh-server:

$ sudo apt-get autoremove openssh-client openssh-server
$ sudo apt-get install -f openssh-client openssh-server
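
After reinstalling, restart the daemon and test a local connection (on Ubuntu/Debian the service is named ssh):

$ sudo service ssh restart
# -v prints verbose output showing where the connection
# fails, if it still does
$ ssh -v localhost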
answered Oct 25, 2018 by Jino
