Deciding number of spark context objects


On what basis should I decide how many SparkContext objects need to be created when accessing a cluster?

Jan 16, 2019 in Apache Spark by digger
• 26,720 points

1 answer to this question.


A SparkContext represents your application's connection to the cluster, and only one SparkContext can be active per JVM, i.e., per Spark application. So the number of SparkContext objects you create is determined by the number of separate Spark applications you run: one context per application, and vice versa. There is no reason to create more than one within a single application.

answered Jan 16, 2019 by Omkar
• 69,150 points
