What are Spark jobs, Spark tasks, and Spark stages?

asked Mar 17 in Apache Spark by satish kumar
1 answer to this question.

In a Spark application, a job is created whenever you invoke an action on an RDD. The job is the top-level unit of work submitted to Spark. Each job is divided into stages at the points where data must be redistributed across the cluster, mainly at shuffle boundaries; the operations within a stage can run together without a shuffle. Each stage is in turn divided into tasks, one per partition. A task is the smallest unit of work and is executed by an executor.
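Here is a minimal sketch in Scala illustrating where these boundaries fall. The app name, master URL, partition count, and object name are illustrative values, not part of the original answer:

import org.apache.spark.sql.SparkSession

object JobStageTaskDemo {
  def main(args: Array[String]): Unit = {
    // Local 4-core session; "local[4]" and the app name are example values.
    val spark = SparkSession.builder()
      .appName("job-stage-task-demo")
      .master("local[4]")
      .getOrCreate()
    val sc = spark.sparkContext

    // 8 partitions => 8 tasks in the first stage.
    val rdd = sc.parallelize(1 to 1000, 8)

    // Transformations are lazy; no job runs yet.
    val counts = rdd.map(n => (n % 10, 1))
                    .reduceByKey(_ + _)   // shuffle => stage boundary

    // The action below triggers one job, split into two stages at the
    // shuffle: stage 1 runs parallelize + map, stage 2 runs the reduce.
    val result = counts.collect()
    println(result.sortBy(_._1).mkString(", "))

    spark.stop()
  }
}

If you run this and open the Spark UI (http://localhost:4040 by default in local mode), you should see one job with two stages, the first containing eight tasks, one per partition.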

answered Mar 18 by Pavan
