Hadoop daemons not starting


I have installed Hadoop on a Linux cluster. When I try to start the daemons with $bin/start-all.sh, I get the following errors:

mkdir: cannot create directory `/var/log/hadoop/hduser': Permission denied
chown: cannot access `/var/log/hadoop/hduser': No such file or directory
/home/hduser/spring_2012/Hadoop/hadoop/hadoop-1.0.2/bin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-hduser-namenode.pid: Permission denied
head: cannot open `/var/log/hadoop/hduser/hadoop-hduser-namenode-gpu02.cluster.out' for reading: No such file or directory
localhost: /home/hduser/.bashrc: line 10: /act/Modules/3.2.6/init/bash: No such file or directory
localhost: mkdir: cannot create directory `/var/log/hadoop/hduser': Permission denied
localhost: chown: cannot access `/var/log/hadoop/hduser': No such file or directory

Jan 11 in Big Data Hadoop by digger

1 answer to this question.


You have to set this directory in the core-site.xml file, not in hadoop-env.sh. Point hadoop.tmp.dir at a directory your Hadoop user has write permission to:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
<property>
  <name>hadoop.tmp.dir</name>
  <value>/Directory_hadoop_user_have_permission/temp/${user.name}</value>
  <description>A base for other temporary directories.</description>
</property>

<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:54310</value>
  <description>The name of the default file system.  A URI whose
  scheme and authority determine the FileSystem implementation.  The
  uri's scheme determines the config property (fs.SCHEME.impl) naming
  the FileSystem implementation class.  The uri's authority is used to
  determine the host, port, etc. for a filesystem.</description>
</property>

</configuration>
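Before restarting the daemons, make sure the directory you put in hadoop.tmp.dir actually exists and is writable by the user running Hadoop. A minimal sketch (the $HOME/hadoop_tmp path is only an illustration; on a real cluster you would pick your own location, and a path outside your home directory may need sudo and chown):

```shell
# Hypothetical temp directory; must match the hadoop.tmp.dir value
# you set in core-site.xml (here we use a user-writable path, so no sudo needed).
HADOOP_TMP="$HOME/hadoop_tmp"

# Create the directory and restrict access to the owning user/group.
mkdir -p "$HADOOP_TMP"
chmod 750 "$HADOOP_TMP"

# Confirm it is writable before starting the daemons.
ls -ld "$HADOOP_TMP"
```

If the directory is not writable, the daemons will fail at startup with the same "Permission denied" errors shown in the question.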
answered Jan 11 by Omkar
