How to clean up application work directories faster

0 votes
Hi. I am running a Spark application on a system with limited disk space, and the application work directories are filling it up, causing disk-space problems. How can I schedule the cleanup to run sooner?
Mar 26, 2019 in Apache Spark by Joshi
112 views

1 answer to this question.

0 votes

By default, the cleanup TTL is 604800 seconds (7 days), set by the spark.worker.cleanup.appDataTtl property. Two caveats: this periodic cleanup only runs when spark.worker.cleanup.enabled is true (it is false by default), and it only applies to Spark's standalone mode. Both are worker-daemon settings, so they must be passed to the worker through SPARK_WORKER_OPTS in conf/spark-env.sh; passing them to spark-submit configures the application, not the worker:

export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true -Dspark.worker.cleanup.appDataTtl=<time in seconds>"
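
The cleanup pass itself runs on a schedule controlled by spark.worker.cleanup.interval (1800 seconds, i.e. every 30 minutes, by default), so you can tighten that too. A minimal conf/spark-env.sh sketch, assuming a standalone cluster; the values 600 and 86400 (check every 10 minutes, purge directories older than one day) are illustrative, not defaults:

# Enable periodic cleanup of application work directories on this worker,
# check every 600 s, and delete directories older than 86400 s (1 day).
export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true \
  -Dspark.worker.cleanup.interval=600 \
  -Dspark.worker.cleanup.appDataTtl=86400"

Restart the worker daemons after editing spark-env.sh. Note that only the directories of stopped applications are cleaned up, so a long-running application's work directory is untouched until it finishes.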
answered Mar 26, 2019 by Jyoti

Related Questions In Apache Spark

0 votes
1 answer

How to give user only view access for Spark application?

You can give users only view permission ...READ MORE

answered Mar 14, 2019 in Apache Spark by Raj
526 views
0 votes
1 answer

How to disable automatic removal of failed applications?

Yes, you have read it right. The ...READ MORE

answered Mar 25, 2019 in Apache Spark by Hari
175 views
0 votes
1 answer

How to increase worker timeout in Spark application?

By default, the timeout is set to ...READ MORE

answered Mar 25, 2019 in Apache Spark by Hari
5,104 views
0 votes
1 answer

How to enable worker cleanup in Spark?

To enable cleanup, open the spark shell ...READ MORE

answered Mar 25, 2019 in Apache Spark by Hari
1,134 views
+1 vote
1 answer

Hadoop Mapreduce word count Program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
7,893 views
0 votes
1 answer

hadoop.mapred vs hadoop.mapreduce?

org.apache.hadoop.mapred is the old API; org.apache.hadoop.mapreduce is the ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
1,328 views
+2 votes
11 answers

hadoop fs -put command?

Hi, You can create one directory in HDFS ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
62,543 views
0 votes
1 answer

How to enable SSL for Spark application?

You can do it dynamically like this: val ...READ MORE

answered Mar 15, 2019 in Apache Spark by Karan
808 views
0 votes
1 answer

How to set extra JVM options for Spark application?

You can set extra JVM options that ...READ MORE

answered Mar 28, 2019 in Apache Spark by Raj
2,512 views