Facing out-of-memory errors in Spark driver

+1 vote

Hi. I am new to Spark and am running a driver job that fails with out-of-memory errors. It works for smaller data (I have tried 400 MB) but not for larger data (I have tried 1 GB and 2 GB). Please help.

Feb 23, 2019 in Apache Spark by Kishan
1,751 views

1 answer to this question.

0 votes

Most likely the memory configured for the driver process is lower than what your job needs. By default, spark.driver.memory is set to 1g. Try increasing it:

// In your application, create the SparkContext without hard-coding driver memory:
val sc = new SparkContext(new SparkConf())

# Then pass a larger driver memory at submit time:
./bin/spark-submit <all your existing options> --conf spark.driver.memory=4g
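
As a quick sanity check, you can read the effective setting back from inside the running application; a minimal Scala sketch, assuming sc is your SparkContext as in the snippet above:

// Print the driver memory the application actually started with.
// Falls back to the 1g default if the property was never set.
val driverMem = sc.getConf.get("spark.driver.memory", "1g")
println(s"spark.driver.memory = $driverMem")

The same value can also be set once in conf/spark-defaults.conf or passed with the equivalent --driver-memory 4g flag. In client mode it must be supplied at launch time, because the driver JVM is already running by the time your SparkConf is evaluated.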
answered Feb 23, 2019 by Rishab
