How to calculate the number of slave nodes

0 votes
I'm planning a Hadoop cluster, and I expect to receive just under 1 TB of data per week, which will be stored on the cluster using Hadoop's default replication. I have decided that each slave node will be configured with 4 x 1 TB disks.

Is there a way to calculate the minimum number of slave nodes I need to deploy to store one year's worth of data?

Thanks in advance!
Aug 16, 2018 in Big Data Hadoop by Shubham
• 13,490 points
1,010 views

1 answer to this question.

0 votes
To work out the number of slave nodes for your case, first determine the total storage space required.

Total space required: 52 (weeks) * 1 TB (data per week) * 3 (default replication factor) = 156 TB

Each slave node provides 4 x 1 TB = 4 TB of raw capacity, so the minimum number of slave nodes required is 156 / 4 = 39.
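
If you want to reproduce the arithmetic or adjust the assumptions later, here is a minimal sketch in Python. The weekly data volume, replication factor, and per-node disk configuration are the assumptions stated above, and the function name is just illustrative:

import math

def min_slave_nodes(weekly_data_tb=1, weeks=52, replication=3,
                    disks_per_node=4, disk_size_tb=1):
    # Raw HDFS space needed once replication is applied
    total_tb = weekly_data_tb * weeks * replication      # 52 * 1 * 3 = 156 TB
    # Raw capacity contributed by each slave node
    node_capacity_tb = disks_per_node * disk_size_tb     # 4 * 1 = 4 TB
    # Round up so a partial node still counts as a whole machine
    return math.ceil(total_tb / node_capacity_tb)        # 156 / 4 = 39

print(min_slave_nodes())  # 39

Keep in mind this is the bare minimum for the data itself; in practice some disk space on each node is reserved for the OS and non-HDFS use, so the real node count is usually a bit higher.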

Hope this helps.
answered Aug 16, 2018 by nitinrawat895
• 11,380 points
