File limit for a single NameNode

Assuming a single NameNode has 32 GB of memory, how many files can I store on it?
Jul 25 in Big Data Hadoop by nitinrawat895

1 answer to this question.

Each file's metadata object in the NameNode consumes roughly 150 bytes of heap.

Each block object for that file consumes another ~150 bytes.

So 1 million files, each with a single block, consume (150 + 150) × 1,000,000 = 300,000,000 bytes ≈ 300 MB, with a replication factor of 1.

With a replication factor of 3, that grows to about 900 MB.

In other words, budget roughly 1 GB of NameNode memory per 1 million files; a 32 GB NameNode can therefore hold on the order of 30 million files.
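A minimal sketch of that arithmetic in Python, assuming the ~150-byte per-object figures above and treating the replication factor as a multiplier on the whole metadata footprint (as the 300 MB → 900 MB figures in this answer imply); namenode_heap_bytes is a hypothetical helper, not a Hadoop API:

    # Back-of-the-envelope NameNode heap estimate (hypothetical helper, not a Hadoop API).
    # Constants and the replication multiplier follow the assumptions in the answer above.
    BYTES_PER_FILE_OBJECT = 150   # per-file metadata, per the answer
    BYTES_PER_BLOCK_OBJECT = 150  # per-block metadata, per the answer

    def namenode_heap_bytes(num_files, blocks_per_file=1, replication=1):
        """Rough heap needed to track num_files files, per the answer's arithmetic."""
        per_file = BYTES_PER_FILE_OBJECT + blocks_per_file * BYTES_PER_BLOCK_OBJECT
        return num_files * per_file * replication

    print(namenode_heap_bytes(1_000_000) / 1e6, "MB")                 # ~300 MB, replication 1
    print(namenode_heap_bytes(1_000_000, replication=3) / 1e6, "MB")  # ~900 MB, replication 3
    print(32e9 / namenode_heap_bytes(1_000_000, replication=3))       # ≈ 35, i.e. ~35 million files in 32 GB

The last line just divides the 32 GB mentioned in the question by the per-million-files cost, which is where the "tens of millions of files" estimate comes from.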
answered Jul 25 by ravikiran
