Deciding the number of SparkContext objects

-1 vote

On what basis should I decide how many SparkContext objects need to be created when accessing a cluster?

Jan 16, 2019 in Apache Spark by digger
• 26,740 points
475 views

1 answer to this question.

0 votes

How many SparkContext objects you create depends on how many Spark applications you want to run. Each Spark application has exactly one SparkContext, and only one active SparkContext is allowed per JVM. So, in short, the number of Spark applications equals the number of SparkContext objects, and vice versa. Note that a single application can run many jobs (one per action) on its one SparkContext.
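To illustrate, here is a minimal sketch in Scala (assuming local mode; the app name and object name are made up) showing one SparkContext serving several jobs inside a single application:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch: one SparkContext per application, many jobs on that one context.
object SingleContextDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("single-context-demo") // hypothetical app name
      .setMaster("local[*]")             // assumption: local mode, just for this sketch

    // Create the application's single SparkContext once, at startup.
    val sc = new SparkContext(conf)

    val rdd = sc.parallelize(1 to 100)
    println(rdd.sum())   // action -> job 1, runs on the same context
    println(rdd.count()) // action -> job 2, runs on the same context

    // A second `new SparkContext(conf)` here would throw, because only one
    // active SparkContext is allowed per JVM.
    sc.stop()
  }
}
```

If you need two separate SparkContext objects, submit two separate applications; each driver JVM then creates its own context.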

answered Jan 16, 2019 by Omkar
• 69,210 points
