If the HDFS threshold has been reached, what is our approach to resolving this issue?

0 votes
Jun 21, 2019 in Big Data Hadoop by Manish
1,519 views

2 answers to this question.

0 votes

DataNodes fill their disks unevenly. Most of the time, certain disks are full while others still have space on them. You can use a tool called Disk Balancer to solve this issue: it lets administrators rebalance data across the multiple disks of a DataNode. To know more, refer to this: https://issues.apache.org/jira/browse/HDFS-1312
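
A rough sketch of the workflow (assuming Hadoop 3.x, where the Disk Balancer from HDFS-1312 ships, and that dfs.disk.balancer.enabled is set to true in hdfs-site.xml; the hostname and plan-file path below are placeholders):

# Generate a plan describing how data should move between the disks of one DataNode
hdfs diskbalancer -plan datanode1.example.com

# Execute the plan on that DataNode (use the plan-file path printed by the -plan step)
hdfs diskbalancer -execute /system/diskbalancer/<date>/datanode1.example.com.plan.json

# Check the progress of the running plan
hdfs diskbalancer -query datanode1.example.com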

answered Jun 24, 2019 by Rishi
0 votes

This was a fundamental issue in HDFS for a long time, but there is now a tool called Disk Balancer. It essentially allows you to create a plan file that describes how data will be moved from disk to disk, and then ask the DataNode to execute that plan.

If one disk is over-utilized, writes will fail whenever the DataNode picks that disk, so you need to make sure data is distributed evenly across all of the disks. That is what the Disk Balancer does for you: it computes how much data should be moved for each disk, taking the disk's type into account. A hedged illustration follows below.
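
For example (command names assume Hadoop 3.x with the Disk Balancer enabled; the hostname is a placeholder), you can check how evenly data is spread across a DataNode's volumes before and after executing a plan:

# Report per-volume usage and data density for a specific DataNode
hdfs diskbalancer -report -node datanode1.example.com

# List the DataNodes that would benefit most from rebalancing
hdfs diskbalancer -report -top 5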

answered Jun 24, 2019 by Gitika
• 65,770 points

Related Questions In Big Data Hadoop

0 votes
1 answer

What is the command to navigate in HDFS?

First of all there is no command ...READ MORE

answered Apr 27, 2018 in Big Data Hadoop by Shubham
• 13,490 points
5,712 views
0 votes
1 answer

What is the command to find the free space in HDFS?

You can use dfsadmin which runs a ...READ MORE

answered Apr 29, 2018 in Big Data Hadoop by Shubham
• 13,490 points
2,231 views
0 votes
1 answer

What is the standard way to create files in your hdfs file-system?

Well, it's so easy. Just enter the below ...READ MORE

answered Sep 23, 2018 in Big Data Hadoop by Frankie
• 9,830 points
2,614 views
0 votes
1 answer

Apache Hadoop Yarn example program

You can go to this location $Yarn_Home/share/hadoop/mapreduce . You'll ...READ MORE

answered Apr 4, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
1,232 views
+1 vote
1 answer

Hadoop Mapreduce word count Program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
11,078 views
0 votes
1 answer

hadoop.mapred vs hadoop.mapreduce?

org.apache.hadoop.mapred is the Old API  org.apache.hadoop.mapreduce is the ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
2,575 views
+2 votes
11 answers

hadoop fs -put command?

Hi, You can create one directory in HDFS ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
109,078 views
0 votes
1 answer

Why do we need to install "ant -Dhadoopversion=23"? What is the use of this?

Hi, We install ant to build pig, If the ...READ MORE

answered Apr 26, 2019 in Big Data Hadoop by Gitika
• 65,770 points
679 views
0 votes
1 answer

What will happen if the OOZIE_URL environment variable has not been set?

Hey, if the OOZIE_URL environment variable has not ...READ MORE

answered Jun 24, 2019 in Big Data Hadoop by Gitika
• 65,770 points
1,206 views