Why does Hadoop use its own data types instead of the basic Java data types?

0 votes
I have just started writing MapReduce code in Hadoop and I'm not sure why Hadoop uses its own data types like BooleanWritable, ByteWritable, IntWritable and LongWritable instead of the basic Java data types. Can anyone help me understand this? For example, this is the kind of mapper I have been writing (a minimal sketch; the class and variable names are just illustrative):
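```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Word-count style mapper: every key/value crossing the map/reduce
// boundary is a Writable type rather than a Java primitive or String.
public class TokenMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        for (String token : value.toString().split("\\s+")) {
            word.set(token);
            context.write(word, ONE);
        }
    }
}
```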
Apr 15, 2018 in Big Data Hadoop by Shubham
• 13,110 points
58 views

1 answer to this question.

0 votes
Hadoop provides Writable-based data types for serializing and de-serializing keys and values in MapReduce computation. Using Writables such as BooleanWritable, ByteWritable, IntWritable and LongWritable makes that serialization efficient.

They also give a compact representation: unlike Java serialization, a Writable does not store type metadata alongside the serialized data, because both ends of the stream already know how to de-serialize it. This is why MapReduce uses the Writable interface instead of Java serialization. As a rough sketch, here is how an IntWritable serializes compared to a plain Java-serialized Integer (the class name WritableSizeDemo is just illustrative):
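```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import org.apache.hadoop.io.IntWritable;

public class WritableSizeDemo {
    public static void main(String[] args) throws IOException {
        // Hadoop Writable: write() emits only the raw 4-byte value.
        ByteArrayOutputStream writableBytes = new ByteArrayOutputStream();
        new IntWritable(42).write(new DataOutputStream(writableBytes));

        // Plain Java serialization: also writes a stream header and a
        // class descriptor with every object.
        ByteArrayOutputStream javaBytes = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(javaBytes)) {
            oos.writeObject(Integer.valueOf(42));
        }

        System.out.println("IntWritable: " + writableBytes.size() + " bytes");        // 4
        System.out.println("Serialized Integer: " + javaBytes.size() + " bytes");     // much larger, mostly metadata
    }
}
```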
answered Apr 15, 2018 by kurt_cobain
• 9,240 points
