How to enable worker cleanup in Spark?

0 votes
I have a standalone Spark application running on my VM. I want to enable the cleanup feature to periodically clean up the worker directories. How do I do this?
Mar 25 in Apache Spark by Dinesh
166 views

1 answer to this question.

0 votes

To enable cleanup, note that spark.worker.cleanup.enabled is read by the standalone Worker daemon, not by your application, so setting it on a SparkContext or passing it to spark-submit has no effect. Instead, add it to SPARK_WORKER_OPTS in conf/spark-env.sh on each worker node and restart the workers:

export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true"

Once enabled, the worker periodically cleans up its work directories, but only those belonging to stopped applications.

Note that this only affects standalone mode; YARN handles application-directory cleanup differently.
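
If you also want to control how aggressively the cleanup runs, the same mechanism accepts two further properties. A minimal sketch, using Spark's documented defaults of a 30-minute interval and 7-day retention (both values are in seconds; tune them to your disk budget):

export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true \
  -Dspark.worker.cleanup.interval=1800 \
  -Dspark.worker.cleanup.appDataTtl=604800"

Here spark.worker.cleanup.interval controls how often the cleaner wakes up, and spark.worker.cleanup.appDataTtl controls how long each stopped application's work directory is kept.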

answered Mar 25 by Hari

Related Questions In Apache Spark

0 votes
1 answer

How to enable dynamic resource allocation in Spark?

To enable dynamic resource allocation, you ...READ MORE

answered Mar 12 in Apache Spark by veer
89 views
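
Since the snippet above is truncated: dynamic allocation is an application-level setting, so a minimal sketch of the usual spark-submit invocation (both property names come from Spark's documentation; dynamic allocation also requires the external shuffle service to be running on each worker):

./bin/spark-submit <all your existing options> \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true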
0 votes
1 answer

How to convert an RDD object to a DataFrame in Spark

SQLContext has a number of createDataFrame methods ...READ MORE

answered May 30, 2018 in Apache Spark by nitinrawat895
• 10,670 points
1,299 views
0 votes
1 answer

Hadoop Mapreduce word count Program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 10,670 points
2,679 views
0 votes
1 answer

hadoop.mapred vs hadoop.mapreduce?

org.apache.hadoop.mapred is the old API; org.apache.hadoop.mapreduce is the ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 10,670 points
279 views
0 votes
10 answers

hadoop fs -put command?

put syntax: put <localSrc> <dest>
copy syntax: copyFr ...READ MORE

answered Dec 7, 2018 in Big Data Hadoop by Aditya
13,292 views
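
As a concrete illustration of the put syntax quoted above (the local file and HDFS paths are hypothetical):

hadoop fs -put ./sample.txt /user/hadoop/sample.txt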
0 votes
1 answer

How to increase worker timeout in Spark application?

By default, the timeout is set to ...READ MORE

answered Mar 25 in Apache Spark by Hari
398 views
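
For reference, the worker timeout is a standalone master setting, so a hedged sketch of raising it to 10 minutes (spark.worker.timeout is documented with a default of 60 seconds; set it in conf/spark-env.sh on the master and restart):

export SPARK_MASTER_OPTS="-Dspark.worker.timeout=600"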
0 votes
4 answers

How to change the spark Session configuration in Pyspark?

You can dynamically load properties. First create ...READ MORE

answered Dec 10, 2018 in Apache Spark by Vini
13,295 views