Why does Java code in Hadoop use its own data types instead of basic data types?

0 votes
I have just started writing MapReduce code in Hadoop and I'm not sure why Hadoop uses its own data types like BooleanWritable, ByteWritable, IntWritable, LongWritable instead of the basic Java data types. Can anyone help me understand this?
Apr 15, 2018 in Big Data Hadoop by Shubham
• 13,310 points
104 views

1 answer to this question.

0 votes
Hadoop provides data types based on the Writable interface for serialization and de-serialization in MapReduce computation. Using Writables like BooleanWritable, ByteWritable, IntWritable, and LongWritable makes this computation efficient.

They also give a compact representation, because a Writable does not store type metadata alongside the serialized bytes: the reader already knows which type it is de-serializing. Java's built-in serialization, by contrast, writes class names and other metadata with every object. This is why Hadoop uses the Writable interface instead of Java serialization.
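To make the "compact, no type metadata" point concrete, here is a minimal self-contained sketch. Note the assumptions: `Writable` and `MyIntWritable` below are simplified stand-ins written for this example, not the real `org.apache.hadoop.io.Writable` and `IntWritable` classes (which follow the same `write`/`readFields` contract), so the snippet runs without Hadoop on the classpath.

```java
import java.io.*;

// Assumption: simplified stand-in for Hadoop's org.apache.hadoop.io.Writable.
interface Writable {
    void write(DataOutput out) throws IOException;
    void readFields(DataInput in) throws IOException;
}

// Assumption: simplified stand-in for Hadoop's IntWritable.
class MyIntWritable implements Writable {
    private int value;
    MyIntWritable() {}
    MyIntWritable(int value) { this.value = value; }
    int get() { return value; }

    public void write(DataOutput out) throws IOException {
        out.writeInt(value);   // exactly 4 bytes, no type metadata
    }
    public void readFields(DataInput in) throws IOException {
        value = in.readInt();  // the caller already knows the type
    }
}

public class WritableDemo {
    public static void main(String[] args) throws IOException {
        // Writable-style serialization: just the 4-byte payload.
        ByteArrayOutputStream writableBytes = new ByteArrayOutputStream();
        new MyIntWritable(42).write(new DataOutputStream(writableBytes));

        // Java serialization of the same value: class name,
        // serialVersionUID and other metadata are written with it.
        ByteArrayOutputStream javaBytes = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(javaBytes)) {
            oos.writeObject(Integer.valueOf(42));
        }

        System.out.println("Writable-style: " + writableBytes.size() + " bytes");
        System.out.println("Java serialization: " + javaBytes.size() + " bytes");

        // Round-trip: instantiate the known type, then fill in its fields.
        MyIntWritable restored = new MyIntWritable();
        restored.readFields(new DataInputStream(
                new ByteArrayInputStream(writableBytes.toByteArray())));
        System.out.println("restored = " + restored.get());
    }
}
```

Running this shows the Writable-style output is 4 bytes while the Java-serialized `Integer` is many times larger, which matters when MapReduce shuffles billions of keys and values across the network.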
answered Apr 15, 2018 by kurt_cobain
• 9,260 points
