ERROR: Cannot set priority of datanode process

Hi Team,

I am new to Hadoop and am trying to install it on Ubuntu, but when I start the Hadoop daemons I get the following error.

ERROR: Cannot set priority of datanode process
Oct 15, 2020 in Big Data Hadoop by akhtar

1 answer to this question.

Hi @akhtar,

You need to set JAVA_HOME and HADOOP_HOME in your .bashrc file; your Hadoop daemons may not be able to find these environment variables.
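
For example, the lines you add to ~/.bashrc might look like this (a minimal sketch; the paths below assume OpenJDK 8 from the Ubuntu packages and a Hadoop install under /usr/local/hadoop, so adjust them to wherever Java and Hadoop actually live on your machine):

    # Assumed install locations -- replace with your own paths.
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
    export HADOOP_HOME=/usr/local/hadoop
    # Put the Hadoop start/stop scripts on the PATH.
    export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

Then reload the file and restart the daemons:

    source ~/.bashrc
    stop-dfs.sh
    start-dfs.sh

Note that the daemon scripts also read $HADOOP_HOME/etc/hadoop/hadoop-env.sh, so setting JAVA_HOME there as well is a common fix for this error.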

answered Oct 15, 2020 by MD
