Changing the blacklist time of an executor

I want to increase how long an executor stays blacklisted so it is kept from running tasks. Right now, after an executor is blacklisted, it becomes available again after some time. How can I increase this timeout so the executor remains blacklisted for longer?
Mar 11, 2019 in Apache Spark by Karan

1 answer to this question.


By default, a blacklisted node or executor stays blacklisted for 1 hour, during which Spark will not schedule new tasks on it. To change how long it stays blacklisted, set the spark.blacklist.timeout property, either on the SparkConf or when submitting the job:

val sc = new SparkContext(new SparkConf().set("spark.blacklist.timeout", "<time to blacklist>"))
./bin/spark-submit <all your existing options> --conf spark.blacklist.timeout=<time to blacklist>
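For instance, here is a minimal sketch of the programmatic route (the app name and the 2h value are only illustrative; any valid time string such as 90m or 1d works). Note that the timeout only matters if blacklisting itself is enabled, which it is not by default:

import org.apache.spark.{SparkConf, SparkContext}

// Illustrative values only: turn blacklisting on and keep a bad executor
// blacklisted for 2 hours instead of the default 1 hour.
val conf = new SparkConf()
  .setAppName("blacklist-timeout-example")
  .set("spark.blacklist.enabled", "true")   // blacklisting is disabled by default
  .set("spark.blacklist.timeout", "2h")     // how long an executor stays blacklisted

val sc = new SparkContext(conf)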
answered Mar 11, 2019 by Raj
