How to calculate number of slave nodes?

I'm planning a Hadoop cluster, and I expect to receive just under 1 TB of data per week, which will be stored on the cluster using Hadoop's default replication. So I have decided that every slave node will be configured with 4 x 1 TB disks.

Is there a way to calculate the minimum number of slave nodes I need to deploy to store one year's worth of data?

Thanks in advance!
Aug 16, 2018 in Big Data Hadoop by Shubham

1 answer to this question.

To find the minimum number of slave nodes, first work out the total storage space required, then divide by the capacity of each node.

Total space required: 52 (weeks) * 1 TB (data per week) * 3 (default replication factor) = 156 TB

Each slave node provides 4 x 1 TB = 4 TB of disk, so the minimum number of slave nodes is: 156 / 4 = 39
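The arithmetic above can be sketched in a few lines of Python; the function name and default values are illustrative, taken from the numbers in the question, and should be adjusted for your own ingest rate and hardware:

```python
# Hypothetical sizing helper based on the calculation above.
def min_slave_nodes(weekly_ingest_tb=1, weeks=52, replication=3,
                    disks_per_node=4, disk_size_tb=1):
    # Total raw storage needed, including HDFS replication copies.
    total_tb = weekly_ingest_tb * weeks * replication      # 52 * 1 * 3 = 156 TB
    # Raw disk capacity contributed by each slave node.
    node_capacity_tb = disks_per_node * disk_size_tb       # 4 * 1 = 4 TB
    # Round up, since a partially filled node still requires a whole machine.
    return -(-total_tb // node_capacity_tb)

print(min_slave_nodes())  # 39
```

Note that this is a raw-capacity floor: in practice you would also reserve headroom for non-HDFS space (OS, logs, intermediate MapReduce output), so real clusters are typically sized somewhat larger.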

Hope this helps.
answered Aug 16, 2018 by nitinrawat895
