Changing the blacklist time of an executor

0 votes
I want to increase the amount of time an executor stays blacklisted so that it is prevented from running tasks. Currently, after the executor is blacklisted, it becomes available again after some time. How can I increase this timeout so that it remains blacklisted for longer?
Mar 11, 2019 in Apache Spark by Karan
233 views

1 answer to this question.

0 votes

By default, a blacklisted node or executor remains blacklisted for 1 hour, during which Spark will not schedule new tasks on it. To change the blacklisting timeout, set the spark.blacklist.timeout property, either in your SparkConf or when submitting the application:

val conf = new SparkConf().set("spark.blacklist.timeout", "<time to blacklist>")
val sc = new SparkContext(conf)

./bin/spark-submit <all your existing options> --conf spark.blacklist.timeout=<time to blacklist>
answered Mar 11, 2019 by Raj
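
For illustration, here is a minimal Scala sketch of the programmatic approach. The application name and the 2-hour value are assumptions for the example, and note that spark.blacklist.enabled must also be set, since blacklisting is disabled by default:

import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch: turn blacklisting on and keep a misbehaving executor
// blacklisted for 2 hours instead of the default 1 hour.
// The app name and the "2h" value are illustrative, not from the original answer.
val conf = new SparkConf()
  .setAppName("blacklist-timeout-example")
  .set("spark.blacklist.enabled", "true")  // blacklisting is off by default
  .set("spark.blacklist.timeout", "2h")    // how long a node/executor stays blacklisted

val sc = new SparkContext(conf)

// ... run your jobs; executors that get blacklisted will now stay blacklisted for 2 hours.

The same effect can be achieved without code changes by passing --conf spark.blacklist.enabled=true --conf spark.blacklist.timeout=2h to spark-submit.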
