Can the number of Spark tasks be greater than the number of executor cores?

0 votes
What happens when the number of Spark tasks is greater than the number of executor cores? How does Spark handle this scenario?
Jun 16 in Apache Spark by Rishi • 160 points
edited Jun 17 by MD

1 answer to this question.

0 votes

Hi @Rishi,

Yes, the number of Spark tasks can be greater than the number of executor cores. In that situation, the extra tasks simply wait, with their threads sitting in the TIMED_WAITING state, until a core frees up. Each task needs one executor core, so as soon as a core finishes its current task, the next waiting task is automatically assigned to it. You can also increase the number of executors, but that depends on the memory available in your cluster.
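
For illustration, here is a minimal Scala sketch (the app name, partition count, and sleep are made up for the demo, assuming a local Spark installation). With master local[2] only 2 cores are available, so an RDD with 8 partitions produces 8 tasks of which at most 2 run at a time; the other 6 queue up, which you can watch on the Stages tab of the Spark UI:

import org.apache.spark.sql.SparkSession

object TasksVsCores {
  def main(args: Array[String]): Unit = {
    // local[2] gives this application 2 cores in total,
    // so at most 2 tasks can run concurrently
    val spark = SparkSession.builder()
      .appName("tasks-vs-cores")
      .master("local[2]")
      .getOrCreate()
    val sc = spark.sparkContext

    // 8 partitions => 8 tasks; 6 of them wait behind the 2 cores
    val rdd = sc.parallelize(1 to 800, numSlices = 8)
    val partitionSizes = rdd.mapPartitions { it =>
      Thread.sleep(2000) // slow the tasks down so the queuing is visible in the UI
      Iterator(it.size)
    }.collect()

    println(partitionSizes.mkString(", "))
    spark.stop()
  }
}

On a real cluster the same logic applies: the total number of concurrent task slots is roughly the number of executors times the cores per executor (for example, the --num-executors and --executor-cores options of spark-submit on YARN), and a stage with more tasks than slots simply runs them in waves.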

answered Jun 17 by MD • 40,740 points
