Nutch with Hadoop 2.7 error: GC overhead limit exceeded

0 votes

I have a small cluster of 5 systems to crawl a few websites from the web. Apache Nutch 2.3.1 is configured on the master, and there are 4 workers. Most of the Hadoop configuration is left at its defaults. Each node has 16 GB of memory in total. While running a job, I observed the following error and the job failed.

2019-04-09 23:04:06,732 INFO [main] org.apache.gora.mapreduce.GoraRecordWriter: Flushing the datastore after 5590000 records
2019-04-09 23:07:27,944 ERROR [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: GC overhead limit exceeded
	at java.util.Arrays.copyOf(Arrays.java:3181)
	at java.util.ArrayList.grow(ArrayList.java:261)
	at java.util.ArrayList.ensureExplicitCapacity(ArrayList.java:235)
	at java.util.ArrayList.ensureCapacityInternal(ArrayList.java:227)
	at java.util.ArrayList.add(ArrayList.java:458)
	at org.apache.hadoop.hbase.client.MultiAction.add(MultiAction.java:76)

Now the question is: what heap setting should I update, and where, to get rid of this issue? Is it a datanode issue or a MapReduce issue?
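For context, the error comes from a YarnChild process, i.e. a map/reduce task JVM, whose heap is usually controlled in mapred-site.xml rather than in the datanode settings. A sketch of the commonly adjusted properties is below; the values are illustrative assumptions for 16 GB nodes, not a recommendation:

```xml
<!-- mapred-site.xml: illustrative values only; tune for your workload -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>4096</value> <!-- YARN container size for map tasks, in MB -->
</property>
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx3276m</value> <!-- JVM heap, roughly 80% of the container size -->
</property>
<property>
  <name>mapreduce.reduce.memory.mb</name>
  <value>4096</value>
</property>
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Xmx3276m</value>
</property>
```

The java.opts heap is conventionally kept below the container size so the JVM's non-heap overhead does not push the container past its YARN memory limit.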

Apr 10, 2019 in Big Data Hadoop by Hafiz
• 170 points
1,545 views
Seems like there's a problem in applying the changes. Did you try restarting?
Yes, I tried restarting the complete cluster. The problem went away for a few days, but then it appeared again as described in the question.
Can you mention what changes you have made to the configuration?

Hi @Hafiz. I had the same issue a few months ago. The problem was with the -XX parameter. What parameters have you used for this?

Can you please send the configuration file?

No answer to this question. Be the first to respond.
