Changing hostname in property puts Hadoop in safemode

0 votes

I have a CDH pseudo-distributed mode Hadoop cluster. It was working fine. Then, while studying the configuration files, I came across a property in the core-site.xml file. Earlier my host was set to localhost, but I replaced it with hadoop.

After restarting the Hadoop daemons, Hadoop shows that it is in safe mode.

I got the following output:

$ hadoop dfsadmin -report
Safe mode is ON
Configured Capacity: 0 (0 B)
Present Capacity: 0 (0 B)
DFS Remaining: 0 (0 B)
DFS Used: 0 (0 B)
DFS Used%: NaN%
Under replicated blocks: 0
Blocks with corrupt replicas: 0
Missing blocks: 0

I am not able to execute the -cat or -put commands. It says that the NameNode is in safe mode. Can anyone help me understand how I can keep the hostname as hadoop, so that external systems can connect to it and my NameNode does not enter safe mode?

Apr 27, 2018 in Big Data Hadoop by coldcode
• 1,980 points

1 answer to this question.

0 votes

First of all, in safe mode the HDFS filesystem is read-only. All read-only operations can be executed, but operations that modify the filesystem, such as write, create, and delete, cannot be performed.
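Once the hostname resolves correctly and the DataNode has reported its blocks, the NameNode should leave safe mode on its own. If it stays stuck, it can be checked and exited manually; a minimal sketch using the standard dfsadmin safe-mode subcommands (on older CDH releases, hadoop dfsadmin works the same way):

```shell
# Check whether the NameNode is currently in safe mode
hdfs dfsadmin -safemode get

# Block until the NameNode leaves safe mode on its own
hdfs dfsadmin -safemode wait

# Or force it out manually (only if the report shows no missing blocks)
hdfs dfsadmin -safemode leave
```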

Next, your hostname issue occurs because your machine cannot resolve the new domain name. To fix this, append the following line to your /etc/hosts file: (your own IP address) hadoop
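In other words, the hostname in /etc/hosts must match the one used in the NameNode URI in core-site.xml. A sketch, assuming the machine's IP is 192.168.1.10 (a placeholder; substitute your own) and the CDH default NameNode port 8020:

```shell
# /etc/hosts — map the machine's own IP to the new hostname
192.168.1.10   hadoop
```

```xml
<!-- core-site.xml: the NameNode URI must use the hostname that /etc/hosts resolves -->
<property>
  <name>fs.default.name</name>  <!-- fs.defaultFS in Hadoop 2.x and later -->
  <value>hdfs://hadoop:8020</value>
</property>
```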
answered Apr 27, 2018 by Shubham
• 12,790 points

