touchz: io.bytes.per.checksum(512) and blockSize(256) do not match. blockSize should be a multiple of io.bytes.per.checksum


Hi Guys,

I am new to Hadoop. I want to create a file in my Hadoop cluster, but it is showing me the error below.

$ hadoop fs  -D dfs.block.size=256 -touchz abcde.txt /
touchz: io.bytes.per.checksum(512) and blockSize(256) do not match. blockSize should be a multiple of io.bytes.per.checksum
Sep 25, 2020 in Big Data Hadoop by akhtar

1 answer to this question.


Hi@akhtar,

The default block size in Hadoop is 64 MB (128 MB in Hadoop 2.x and later). You can set a custom block size per file with the dfs.block.size parameter, but the value must be a multiple of io.bytes.per.checksum, which defaults to 512 bytes. You are trying to set a block size of 256 bytes, which is not a multiple of 512, hence the error.
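To fix it, pass a block size (in bytes) that is a multiple of 512. A minimal sketch, assuming /abcde.txt is the intended HDFS target path and a 1 MB block is acceptable; the divisibility check at the end runs locally without a cluster:

```shell
# Corrected command (run on a cluster node; /abcde.txt is an assumed target path):
#   hadoop fs -D dfs.block.size=1048576 -touchz /abcde.txt
# dfs.block.size is given in bytes; 1048576 = 1 MB = 2048 * 512.

# Quick local check that a candidate block size is a multiple of
# io.bytes.per.checksum (512 bytes by default):
BLOCK_SIZE=1048576
CHECKSUM=512
if [ $((BLOCK_SIZE % CHECKSUM)) -eq 0 ]; then
  echo "OK: $BLOCK_SIZE is a valid block size"
else
  echo "ERROR: $BLOCK_SIZE is not a multiple of $CHECKSUM"
fi
```

With 256 in place of 1048576, the remainder is 256, so the check fails for the same reason the touchz command did.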

answered Sep 25, 2020 by MD
