Can the number of executor cores be greater than the total number of Spark tasks?

0 votes
What happens when the number of executor cores is greater than the number of Spark tasks? Is this scenario possible? If yes, what happens to the extra cores?
Jun 16 in Apache Spark by Rishi

1 answer to this question.

0 votes
Hi @Rishi,

Yes, it is possible. If the total number of executor cores is greater than the number of tasks, each task still gets only one core, and the extra cores are left on executors that have no active tasks, so they sit idle. Those executors still hold on to memory, which wastes cluster resources and can slow down the overall job.
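For example, here is a minimal sketch of that situation (the executor counts, class name, and jar name are made up for illustration):

// Hypothetical submit: 2 executors x 4 cores = 8 cores in total
// spark-submit --num-executors 2 --executor-cores 4 --class CoresVsTasks app.jar

import org.apache.spark.sql.SparkSession

object CoresVsTasks {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("cores-vs-tasks").getOrCreate()

    // Only 3 partitions => only 3 tasks per stage, so at most 3 of the
    // 8 requested cores can ever be busy; the other 5 stay idle while
    // their executors keep holding memory.
    val rdd = spark.sparkContext.parallelize(1 to 1000, 3)
    println(rdd.map(_ * 2).count())

    spark.stop()
  }
}

If you run into this, you can either request fewer cores (lower --executor-cores or --num-executors) or increase the number of partitions (for example with repartition()) so that the task count roughly matches the cores you ask for.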
answered Jun 17 by MD
