What happens in a MapReduce job when you set the number of reducers to one?

0 votes
I am new to Hadoop. I know that the default replication factor in Hadoop is 3, but my question is: what happens to a MapReduce job when we set the number of reducers to one?
Jul 31, 2018 in Big Data Hadoop by Shubham

1 answer to this question.

0 votes
If you set the number of reducers to 1, a single reducer gathers and processes all the output from all the mappers. The output is then written to a single file in HDFS.
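To see why everything funnels into one reducer, here is a minimal sketch (in Python, for illustration only) of how Hadoop's default HashPartitioner assigns keys to reducers. The mapper output below is made-up sample data; the point is that with one reducer, every key hashes to partition 0, so all records land in a single partition and hence a single output file.

```python
# Simulation of Hadoop's shuffle phase with a single reducer.
# With num_reducers = 1, the default HashPartitioner sends every
# key to partition 0, so one reducer sees all mapper output.

def hash_partition(key, num_reducers):
    # Mirrors HashPartitioner: (key.hashCode() & Integer.MAX_VALUE) % numReduceTasks
    return (hash(key) & 0x7FFFFFFF) % num_reducers

# Hypothetical mapper output (key, value) pairs from a word count.
mapper_output = [("apple", 1), ("banana", 1), ("apple", 1), ("cherry", 1)]

num_reducers = 1
partitions = {i: [] for i in range(num_reducers)}
for key, value in mapper_output:
    partitions[hash_partition(key, num_reducers)].append((key, value))

# Every record ends up in partition 0 -> one output file (part-r-00000).
print(len(partitions[0]))  # 4
```

In a real MapReduce driver this corresponds to calling `job.setNumReduceTasks(1)` in Java, or passing `-D mapreduce.job.reduces=1` on the command line.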

Hope this answers your question.
answered Jul 31, 2018 by nitinrawat895
