How do you handle the failure of Data node?

When we are writing a file in Hadoop and a DataNode fails, how does HDFS handle the failure?

Apr 9, 2019 in Big Data Hadoop by anonymous

1 answer to this question.

HDFS has a master/slave architecture in which the master is the NameNode and the slaves are the DataNodes.

An HDFS cluster has a single NameNode that manages the file system namespace (metadata) and multiple DataNodes that store the actual data and perform the read/write operations.

In HDFS, each DataNode in the cluster sends a heartbeat to the NameNode at a fixed interval (3 seconds by default, controlled by dfs.heartbeat.interval). As long as heartbeats keep arriving, the NameNode considers that DataNode to be working properly.

If the NameNode stops receiving heartbeats from a DataNode, it assumes the node is either dead or not functioning properly. Once a DataNode is declared dead, the data blocks it held are re-replicated onto other DataNodes until every block again satisfies the replication factor specified in the hdfs-site.xml file.
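For reference, the replication factor and heartbeat timing mentioned above live in hdfs-site.xml. A minimal sketch using the standard Hadoop property names, with their default values shown:

```xml
<configuration>
  <!-- Number of copies kept for each block (default: 3) -->
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
  <!-- Interval between DataNode heartbeats, in seconds (default: 3) -->
  <property>
    <name>dfs.heartbeat.interval</name>
    <value>3</value>
  </property>
  <!-- NameNode recheck interval in milliseconds; together with the
       heartbeat interval this determines when a DataNode is declared
       dead: 2 * recheck + 10 * heartbeat, roughly 10.5 minutes
       with the defaults -->
  <property>
    <name>dfs.namenode.heartbeat.recheck-interval</name>
    <value>300000</value>
  </property>
</configuration>
```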

Once the failed DataNode comes back, the NameNode manages the replication factor again, removing any block copies that are now over-replicated. This is how the NameNode handles the failure of a DataNode.
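You can observe this behaviour on a running cluster with the standard HDFS CLI (these commands assume an hdfs client configured against your cluster):

```shell
# List live and dead DataNodes as seen by the NameNode
hdfs dfsadmin -report

# Check the filesystem for under-replicated or missing blocks
hdfs fsck / -blocks
```

After a DataNode fails, `-report` will show it under the dead nodes, and `fsck` will report under-replicated blocks until re-replication completes.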

answered Apr 9, 2019 by Gitika

edited Apr 9, 2019 by Gitika
