Can the number of Spark tasks be greater than the number of executor cores?

What happens when the number of Spark tasks is greater than the number of executor cores? How does Spark handle this scenario?
Asked Jun 16 in Apache Spark by Rishi (edited Jun 17 by MD)

1 answer to this question.

Hi@Rishi,

Yes, the number of Spark tasks can be greater than the number of executor cores. In that situation the extra tasks simply wait: each task needs one executor core, so only as many tasks as there are cores run at a time, and the remaining task threads just sit idle (in the TIMED_WAITING state) until a core becomes free. As soon as a core finishes its current task, the scheduler automatically assigns it the next waiting task. You can also increase the number of executors (or cores per executor), but that depends on the memory available in your cluster.
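To see this in action, here is a minimal sketch (the class name, numbers, and the local[2] master are illustrative assumptions, not from the question): with 2 cores available and an RDD of 8 partitions, each stage produces 8 tasks, but only 2 run concurrently while the other 6 wait for a free core.

```scala
import org.apache.spark.sql.SparkSession

object TasksVsCores {
  def main(args: Array[String]): Unit = {
    // Assumption: a local run with 2 worker threads stands in for
    // "an executor with 2 cores".
    val spark = SparkSession.builder()
      .appName("tasks-vs-cores")
      .master("local[2]")
      .getOrCreate()

    // 8 partitions => 8 tasks in this stage, but only 2 cores are available,
    // so at most 2 tasks run at once; the rest wait until a core frees up.
    val total = spark.sparkContext
      .parallelize(1 to 1000000, numSlices = 8)
      .map(_.toLong * 2)
      .reduce(_ + _)

    println(s"total = $total")
    spark.stop()
  }
}
```

If you watch the Stages tab of the Spark UI while this runs, the stage shows 8 tasks with only 2 active at any moment. On a real cluster you would raise the parallelism with spark-submit options such as --executor-cores and --executor-memory (and --num-executors on YARN), subject to the memory your cluster can provide.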

answered Jun 17 by MD
