Change number of completed drivers displayed

0 votes
Hi. Please help me change the number of completed drivers displayed in the UI. What command should I use to change this dynamically?
Mar 25 in Apache Spark by Tina
13 views

1 answer to this question.

0 votes

You can change the number of completed drivers displayed through the spark.deploy.retainedDrivers property. When submitting your application, pass it with --conf:

./bin/spark-submit <all your existing options> --conf spark.deploy.retainedDrivers=<number of drivers to be displayed>
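
Note that spark.deploy.* properties configure the standalone Master rather than an individual application, so if the value passed through spark-submit does not take effect, set it on the master node instead. A minimal sketch, assuming a standalone cluster and an example limit of 50: add the property to conf/spark-env.sh on the master and restart the Master so it picks up the new value:

# In conf/spark-env.sh on the master node (50 is an example value)
export SPARK_MASTER_OPTS="-Dspark.deploy.retainedDrivers=50"

# Restart the standalone Master to apply the setting
./sbin/stop-master.sh
./sbin/start-master.sh

The default is 200 completed drivers; the related spark.deploy.retainedApplications property controls how many completed applications the UI retains.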
answered Mar 25 by Hari

Related Questions In Apache Spark

0 votes
1 answer

Change number of threads use by R back end

Refer to the below commands to know ...READ MORE

answered Mar 19 in Apache Spark by Jai
12 views
0 votes
1 answer

How to get the number of elements in partition?

rdd.mapPartitions(iter => Array(iter.size).iterator, true) This command will ...READ MORE

answered May 8, 2018 in Apache Spark by kurt_cobain
• 9,240 points
109 views
0 votes
1 answer

Deciding number of spark context objects

How many spark context objects you should ...READ MORE

answered Jan 16 in Apache Spark by Omkar
• 67,120 points
35 views
0 votes
1 answer

Hadoop Mapreduce word count Program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 10,070 points
2,040 views
0 votes
1 answer

hadoop.mapred vs hadoop.mapreduce?

org.apache.hadoop.mapred is the old API; org.apache.hadoop.mapreduce is the ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 10,070 points
193 views
0 votes
10 answers

hadoop fs -put command?

copy command can be used to copy files ...READ MORE

answered Dec 7, 2018 in Big Data Hadoop by Sujay
10,409 views
0 votes
1 answer

Increase number of cores in Spark

Now that the job is already running, ...READ MORE

answered Feb 22 in Apache Spark by Reshma
60 views
0 votes
1 answer

Dynamic allocation: Set minimum number of executor

You can set it dynamically like this: val ...READ MORE

answered Mar 13 in Apache Spark by Venu
10 views