Why does Java code in Hadoop use its own data types instead of basic data types?

0 votes
I have just started writing MapReduce code in Hadoop and I'm not sure why Hadoop uses its own data types like BooleanWritable, ByteWritable, IntWritable, and LongWritable instead of the basic Java data types. Can anyone help me understand this?
Apr 15, 2018 in Big Data Hadoop by Shubham
• 13,490 points
1,174 views

1 answer to this question.

0 votes
Hadoop provides Writable-based data types for serialization and de-serialization in MapReduce computation. Using Writables such as BooleanWritable, ByteWritable, IntWritable, and LongWritable keeps that computation efficient, because MapReduce constantly writes keys and values to disk and ships them across the network, and these types are designed to be serialized cheaply (see the sketch below).
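
For illustration, here is a minimal word-count style sketch showing where these Writable types appear. The class name WordCountMapper is just for this example, not from the question:

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical mapper: the keys and values that cross the framework boundary
// are Writables (LongWritable, Text, IntWritable), not long/String/int.
public class WordCountMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable offset, Text line, Context context)
            throws IOException, InterruptedException {
        for (String token : line.toString().split("\\s+")) {
            if (token.isEmpty()) {
                continue;
            }
            word.set(token);           // reuse the same Writable to avoid allocations
            context.write(word, ONE);  // the framework serializes these Writables
        }
    }
}
```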

They also give a compact representation: the serialized form contains only the field values, with no class names or type metadata, because both the writer and the reader already know how to de-serialize the data. This is why Hadoop uses the Writable interface instead of standard Java serialization.
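
To illustrate why the representation is compact and the de-serialization logic is fixed, here is a rough sketch of a custom Writable. The class name PairWritable and its fields are hypothetical, chosen only for this example:

```java
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.Writable;

// Hypothetical custom Writable: only the raw field values go on the wire,
// with no class names or type metadata, unlike java.io.Serializable.
public class PairWritable implements Writable {
    private int left;
    private long right;

    public PairWritable() { }            // Writables need a no-arg constructor

    public void set(int left, long right) {
        this.left = left;
        this.right = right;
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeInt(left);              // exactly 4 bytes
        out.writeLong(right);            // exactly 8 bytes
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        left = in.readInt();             // the reader already knows the layout
        right = in.readLong();
    }
}
```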
answered Apr 15, 2018 by kurt_cobain
• 9,350 points

Related Questions In Big Data Hadoop

+1 vote
1 answer

Why is jar file required to execute a MR code instead of class file?

We use hadoop keyword to invoke the ...READ MORE

answered Apr 24, 2018 in Big Data Hadoop by Shubham
• 13,490 points
1,275 views
0 votes
1 answer

Why Apache Pig is used instead of Hadoop?

As you know writing mapreduce programs in ...READ MORE

answered May 8, 2018 in Big Data Hadoop by Ashish
• 2,650 points
2,486 views
0 votes
1 answer

Hadoop: java.io.IOException: File could only be replicated to 0 nodes instead of minReplication (=1)

Try this, first stop all the daemons, ...READ MORE

answered Nov 6, 2018 in Big Data Hadoop by Omkar
• 69,220 points
3,529 views
0 votes
1 answer

Moving files in Hadoop using the Java API?

I would recommend you to use FileSystem.rename(). ...READ MORE

answered Apr 15, 2018 in Big Data Hadoop by Shubham
• 13,490 points
2,693 views
0 votes
1 answer

Hadoop giving java.io.IOException in mkdir Java code.

I am not sure about the issue. ...READ MORE

answered May 3, 2018 in Big Data Hadoop by Shubham
• 13,490 points
2,568 views
+1 vote
1 answer

Hadoop Mapreduce word count Program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
11,076 views
+2 votes
11 answers

hadoop fs -put command?

Hi, You can create one directory in HDFS ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
109,070 views
+1 vote
1 answer

Why minimum 3 Journal Nodes are required in Hadoop HA architecture?

Initially in Hadoop 1.x, the NameNode was ...READ MORE

answered Apr 20, 2018 in Big Data Hadoop by kurt_cobain
• 9,350 points
11,680 views