Whenever I try to start the daemons with ./start-dfs.sh, it says permission denied. I have even tried changing the file permissions, but it didn't work.

[edureka@localhost sbin]$ start-dfs.sh
bash: start-dfs.sh: command not found
[edureka@localhost sbin]$ ./start-dfs.sh
19/03/08 11:30:46 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
edureka@localhost's password:
localhost: starting namenode, logging to /home/edureka/hadoop-2.7.7/logs/hadoop-edureka-namenode-localhost.localdomain.out
localhost: /home/edureka/hadoop-2.7.7/sbin/hadoop-daemon.sh: line 165: /usr/lib/hadoop-2.2.0/hadoop2_data/hdfs/pid/hadoop-edureka-namenode.pid: Permission denied
edureka@localhost's password:
localhost: starting datanode, logging to /home/edureka/hadoop-2.7.7/logs/hadoop-edureka-datanode-localhost.localdomain.out
localhost: /home/edureka/hadoop-2.7.7/sbin/hadoop-daemon.sh: line 165: /usr/lib/hadoop-2.2.0/hadoop2_data/hdfs/pid/hadoop-edureka-datanode.pid: Permission denied
Starting secondary namenodes [0.0.0.0]
edureka@0.0.0.0's password:
0.0.0.0: starting secondarynamenode, logging to /home/edureka/hadoop-2.7.7/logs/hadoop-edureka-secondarynamenode-localhost.localdomain.out
0.0.0.0: /home/edureka/hadoop-2.7.7/sbin/hadoop-daemon.sh: line 165: /usr/lib/hadoop-2.2.0/hadoop2_data/hdfs/pid/hadoop-edureka-secondarynamenode.pid: Permission denied
19/03/08 11:31:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Mar 8 in Big Data Hadoop by Kunal
edited Mar 8 by Omkar

1 answer to this question.


It seems like SSH is not set up properly. Try this:

Create the SSH key on the master node (press Enter when it asks for a filename to save the key):

$ ssh-keygen -t rsa -P ""

Append the generated public key to the master node's authorized keys:

$ cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys

Copy the master node's SSH key to the slave's authorized keys:

$ ssh-copy-id -i $HOME/.ssh/id_rsa.pub edureka@slave

Note: If you use a different username, replace it in the commands above.
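Putting the three steps together, here is a minimal sketch you can run on the master node. It assumes the user is edureka and the slave's hostname is slave, as in this setup; replace both as needed. The chmod lines are a common extra gotcha: sshd silently ignores authorized_keys if the permissions are too open.

```shell
# Run on the master node. -P "" means no passphrase; -f picks the key file.
ssh-keygen -t rsa -P "" -f "$HOME/.ssh/id_rsa"

# Authorize the key for logins to the master itself.
cat "$HOME/.ssh/id_rsa.pub" >> "$HOME/.ssh/authorized_keys"

# sshd ignores authorized_keys if permissions are too open.
chmod 700 "$HOME/.ssh"
chmod 600 "$HOME/.ssh/authorized_keys"

# Copy the key to the slave (note the space before the user@host argument).
ssh-copy-id -i "$HOME/.ssh/id_rsa.pub" edureka@slave

# Both of these should now log in without asking for a password.
ssh localhost true
ssh edureka@slave true
```

If `ssh localhost true` still prompts for a password after this, the problem is local key or permission setup rather than the slave.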

answered Mar 8 by Omkar
This error is displayed when I execute the third command:

ssh: Could not resolve hostname slave: Temporary failure in name resolution

It seems the hostnames in the /etc/hosts file are not right. Suppose the IP address of the master node is 192.168.2.1 and that of the slave is 192.168.2.2; then the entries in the /etc/hosts file of the master should be as follows:

192.168.2.1 master
192.168.2.2 slave

And the entries in the /etc/hosts file of the slave should be as follows:

192.168.2.1 master

After this, restart the sshd service:

$ service sshd restart

and then retry the steps in the answer above.

NOTE: Make sure you use the actual hostnames of the systems you are using (in place of master and slave).
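As a quick sanity check that the entries took effect, you can confirm the hostname now resolves. The sketch below writes the example entries from above to a scratch file so the check itself is demonstrable; on the real system you would grep /etc/hosts directly or run `getent hosts slave`.

```shell
# Example /etc/hosts entries (hypothetical IPs from the answer above),
# written to a scratch file so we don't touch the real /etc/hosts.
HOSTS=$(mktemp)
printf '192.168.2.1 master\n192.168.2.2 slave\n' >> "$HOSTS"

# -w matches "slave" as a whole word; on the real system use /etc/hosts.
grep -w 'slave' "$HOSTS"
```

If `getent hosts slave` prints nothing on the real system, name resolution is still failing and ssh-copy-id will keep reporting "Could not resolve hostname".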
