Change number of completed drivers displayed

0 votes
Hi. Please help me change the number of completed drivers displayed in the UI. What command should I use to change this dynamically?
Mar 25, 2019 in Apache Spark by Tina
30 views

1 answer to this question.

0 votes

You can change the number of completed drivers shown in the standalone Master's web UI with the spark.deploy.retainedDrivers property (the default is 200). Note that this property is read by the Master process, not by your application, so passing it through spark-submit will not work; set it when the Master starts, for example:

SPARK_MASTER_OPTS="-Dspark.deploy.retainedDrivers=<number of drivers to be displayed>" ./sbin/start-master.sh
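To make the setting persist across Master restarts, you can put it in spark-env.sh instead of the start command. A minimal sketch, assuming the default conf/spark-env.sh location and an example value of 50:

```shell
# conf/spark-env.sh on the Master host (sketch; adjust the value to taste)
# spark.deploy.retainedDrivers caps how many completed drivers the standalone
# Master keeps in memory and shows in its web UI; older entries are dropped.
export SPARK_MASTER_OPTS="-Dspark.deploy.retainedDrivers=50"

# Restart the Master so it picks up the new option:
# ./sbin/stop-master.sh && ./sbin/start-master.sh
```

The related spark.deploy.retainedApplications property works the same way for completed applications.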
answered Mar 25, 2019 by Hari
