Limit on NameNode file capacity

0 votes
Assuming the size of one NameNode is 32 GB, how many files can I store in a single NameNode?
Jul 25 in Big Data Hadoop by nitinrawat895
• 10,690 points
13 views

1 answer to this question.

0 votes
Each file object in the NameNode's memory takes roughly 150 bytes.

The block object for that file takes roughly another 150 bytes.

So 1 million files, each with 1 block, consume about 300 * 1,000,000 = 300,000,000 bytes = 300 MB with a replication factor of 1.

With a replication factor of 3, that grows to roughly 900 MB.

As a rule of thumb, every 1 GB of NameNode memory can track about 1 million files, so a 32 GB NameNode can handle on the order of 32 million files.
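Here is a minimal Python sketch of the arithmetic above, assuming the ~150-bytes-per-object rule of thumb; the function names and the way the replication factor is applied are illustrative assumptions, not Hadoop's actual memory accounting.

# Back-of-the-envelope NameNode heap estimate (rule of thumb, not Hadoop's real accounting).
BYTES_PER_FILE_OBJECT = 150   # assumed ~150 bytes of heap per file object
BYTES_PER_BLOCK_OBJECT = 150  # assumed ~150 bytes of heap per block object

def namenode_heap_bytes(num_files, blocks_per_file=1, replication_factor=3):
    # Per-file cost: one file object plus one block object per block,
    # scaled by the replication factor as in the estimate above.
    per_file = BYTES_PER_FILE_OBJECT + blocks_per_file * BYTES_PER_BLOCK_OBJECT
    return num_files * per_file * replication_factor

def max_files_for_heap(heap_bytes, blocks_per_file=1, replication_factor=3):
    # Invert the estimate: how many single-block files fit in the given heap?
    per_file = (BYTES_PER_FILE_OBJECT + blocks_per_file * BYTES_PER_BLOCK_OBJECT) * replication_factor
    return heap_bytes // per_file

if __name__ == "__main__":
    one_million = 1_000_000
    print(namenode_heap_bytes(one_million, replication_factor=1) / 1e6, "MB")  # ~300 MB
    print(namenode_heap_bytes(one_million, replication_factor=3) / 1e6, "MB")  # ~900 MB
    heap_32_gb = 32 * 1024**3
    print(max_files_for_heap(heap_32_gb) / 1e6, "million files")               # roughly 30-40 million

Running this reproduces the 300 MB and 900 MB figures and gives a few tens of millions of files for a 32 GB heap, consistent with the 1 million files per 1 GB rule of thumb.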
answered Jul 25 by ravikiran
• 4,560 points
