What are the typical block sizes in HDFS?


To understand what the block size in HDFS means, let's look at how data gets stored on the DataNodes in a Hadoop cluster.

Apr 8 in Big Data Hadoop by sunny

1 answer to this question.


HDFS is a block-structured file system: every file is divided into fixed-size blocks, and the default block size is 128 MB (in Hadoop 2.x and later; Hadoop 1.x used 64 MB). Let's take an example to understand how HDFS stores a file as data blocks.
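For reference, 128 MB is 134,217,728 bytes, which is the stock value of the dfs.blocksize property in hdfs-site.xml on recent Hadoop releases; clusters that work with very large files often raise it, for example to 256 MB. A minimal plain-Python sketch of that unit conversion (just arithmetic, not an HDFS API):

# 128 MB expressed in bytes; this matches the Hadoop 2.x+ default
# value of the dfs.blocksize property (134217728 bytes).
DEFAULT_BLOCK_SIZE_MB = 128
DEFAULT_BLOCK_SIZE_BYTES = DEFAULT_BLOCK_SIZE_MB * 1024 * 1024
print(DEFAULT_BLOCK_SIZE_BYTES)  # -> 134217728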

Suppose someone wants to store a 380 MB file in the Hadoop Distributed File System. HDFS will divide the file into three blocks, because 380 MB divided by the 128 MB default block size rounds up to three.

The first block occupies 128 MB, the second block also occupies 128 MB, and the third block holds the remaining 124 MB of the file. Once the file has been divided into data blocks, those blocks are distributed across the DataNodes in the Hadoop cluster.
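Here is a small illustrative Python sketch of the same arithmetic; split_into_blocks is just a hypothetical helper for this example, not part of any Hadoop API:

def split_into_blocks(file_size_mb, block_size_mb=128):
    # Sizes (in MB) of the blocks a file of the given size is split
    # into, following the scheme described above.
    blocks = []
    remaining = file_size_mb
    while remaining > 0:
        blocks.append(min(block_size_mb, remaining))
        remaining -= block_size_mb
    return blocks

print(split_into_blocks(380))  # -> [128, 128, 124]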


answered Apr 8 by Gitika

