Why does Java code in Hadoop use its own data types instead of the basic Java data types?

0 votes
I have just started writing MapReduce code in Hadoop, and I'm not sure why Hadoop uses its own data types like BooleanWritable, ByteWritable, IntWritable, and LongWritable instead of the basic Java data types. Can anyone help me understand this?
Apr 15, 2018 in Big Data Hadoop by Shubham
• 12,710 points
54 views

1 answer to this question.

0 votes
Hadoop uses data types based on the Writable interface (BooleanWritable, ByteWritable, IntWritable, LongWritable, and so on) for serialization and de-serialization in MapReduce. MapReduce constantly serializes keys and values to move them between map and reduce tasks and to spill them to disk, so an efficient wire format matters for performance.

Writables give a compact representation: since both the writer and the reader already know which type is being (de)serialized, only the raw value is written to the stream, with none of the class metadata that Java's built-in serialization attaches to every object. Serialized records are therefore smaller and cheaper to read and write, which is why Hadoop uses the Writable interface instead of Java serialization.
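For a concrete sense of the difference, here is a minimal sketch (the class name WritableSizeDemo is just illustrative, and it assumes hadoop-common is on the classpath) that serializes the same value once through IntWritable.write() and once through Java's ObjectOutputStream, then prints the resulting sizes:

```java
import org.apache.hadoop.io.IntWritable;

import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;

public class WritableSizeDemo {
    public static void main(String[] args) throws IOException {
        // Writable: write() emits only the raw 4 bytes of the int value.
        ByteArrayOutputStream writableBytes = new ByteArrayOutputStream();
        new IntWritable(42).write(new DataOutputStream(writableBytes));

        // Java serialization: the stream also carries a header and class metadata.
        ByteArrayOutputStream javaBytes = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(javaBytes)) {
            oos.writeObject(Integer.valueOf(42));
        }

        System.out.println("IntWritable bytes: " + writableBytes.size());          // 4
        System.out.println("Java-serialized Integer bytes: " + javaBytes.size());  // many times larger
    }
}
```

The reader side is symmetric: because the reducer already knows it will receive an IntWritable, it simply calls readFields() on an existing instance, with no type information to parse and no new object graph to reconstruct.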
answered Apr 15, 2018 by kurt_cobain
• 9,260 points

