How to compress the output of a MapReduce job in Hive?

0 votes
Hi. I am running a MapReduce job through Hive and I want the output of the job to be compressed. How can I compress the output of the job? Please help.
May 20 in Big Data Hadoop by Esha
32 views

1 answer to this question.

0 votes

To compress the output of the MapReduce job, you have to enable Hive's output compression. You can do this by setting hive.exec.compress.output to true when starting Hive:

$ hive --hiveconf hive.exec.compress.output=true
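
Alternatively, you can turn it on for the current session from inside the Hive CLI. Here is a minimal sketch; the Snappy codec below is only an example, so substitute whichever codec is actually installed on your cluster:

-- enable compression of the final job output for this session
SET hive.exec.compress.output=true;

-- optionally compress intermediate map output as well
SET hive.exec.compress.intermediate=true;

-- tell the underlying MapReduce job which codec to use (Snappy shown as an example)
SET mapreduce.output.fileoutputformat.compress=true;
SET mapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.SnappyCodec;

Any INSERT OVERWRITE or CREATE TABLE AS SELECT that launches a MapReduce job in that session will then write compressed files to its output directory.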
answered May 20 by Hiran
