How can I run a blacklisted task?

0 votes
Hi. One of my tasks has been blacklisted because it caused too many task failures. Now I want to run this task, but Spark is stopping it from running. How can I run it?
Mar 12, 2019 in Apache Spark by Uma
1,194 views

1 answer to this question.

0 votes

When a task results in too many failures, it is blacklisted and Spark prevents it from being scheduled again. To run the task anyway, you can disable the property that makes Spark blacklist failing tasks when you submit the application. Something like this:

./bin/spark-submit <all your existing options> --conf spark.blacklist.enabled=false

The SparkContext created in your application (for example, val sc = new SparkContext(new SparkConf())) will then pick up this setting.
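
If you are building the SparkConf yourself instead of passing the option to spark-submit, you can also set the property directly in code before the SparkContext is created. A minimal sketch in Scala, assuming Spark 2.x where spark.blacklist.enabled is available (the app name below is just an example):

import org.apache.spark.{SparkConf, SparkContext}

// Disable task blacklisting for this application (sketch; app name is hypothetical)
val conf = new SparkConf()
  .setAppName("rerun-blacklisted-task")
  .set("spark.blacklist.enabled", "false")

val sc = new SparkContext(conf)
// Run the job as usual; tasks that fail will now simply be retried
// instead of being blacklisted on the failing executor.

Keep in mind that blacklisting exists to stop repeatedly failing executors or nodes from receiving tasks, so it is usually better to investigate why the task keeps failing rather than leaving this disabled permanently.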
answered Mar 12, 2019 by Raj

Related Questions In Apache Spark

0 votes
2 answers

In a Spark DataFrame how can I flatten the struct?

// Collect data from input avro file ...READ MORE

answered Jul 4, 2019 in Apache Spark by Dhara dhruve
6,130 views
+1 vote
1 answer

How can I write a text file in HDFS not from an RDD, in Spark program?

Yes, you can go ahead and write ...READ MORE

answered May 29, 2018 in Apache Spark by Shubham
• 13,490 points
8,489 views
0 votes
1 answer

Can I read a CSV represented as a string into Apache Spark?

You can use the following command. This ...READ MORE

answered May 3, 2018 in Apache Spark by kurt_cobain
• 9,350 points
2,529 views
0 votes
1 answer

How can I compare the elements of the RDD using MapReduce?

You have to use the comparison operator ...READ MORE

answered May 24, 2018 in Apache Spark by Shubham
• 13,490 points
3,459 views
+1 vote
1 answer

Hadoop Mapreduce word count Program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
11,073 views
0 votes
1 answer

hadoop.mapred vs hadoop.mapreduce?

org.apache.hadoop.mapred is the Old API  org.apache.hadoop.mapreduce is the ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
2,572 views
+2 votes
11 answers

hadoop fs -put command?

Hi, You can create one directory in HDFS ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
109,061 views
0 votes
1 answer

How to change scheduling mode in Spark?

You can change the scheduling mode as ...READ MORE

answered Mar 12, 2019 in Apache Spark by Raj
2,344 views
0 votes
1 answer

How to increase Spark listener bus event queue capacity?

The default capacity of listener bus is ...READ MORE

answered Mar 12, 2019 in Apache Spark by Raj
7,250 views