Limit displaying completed applications

Hi. I want to limit the number of completed applications that are shown in the UI. Right now, I see too many applications. I want to limit it to a smaller value, say 20. How can I do this?
Mar 25 in Apache Spark by Giri

1 answer to this question.


By default, the standalone Master UI retains 200 completed applications and drops the older ones. This is controlled by the spark.deploy.retainedApplications property. Since it is a Master-side setting, it cannot be passed to spark-submit for a single application; set it on the Master (for example via SPARK_MASTER_OPTS in conf/spark-env.sh) and restart the Master:

export SPARK_MASTER_OPTS="-Dspark.deploy.retainedApplications=20"

Then restart the Master (sbin/stop-master.sh followed by sbin/start-master.sh) for the change to take effect.
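After restarting the Master, one way to check that the new limit is in effect is to query the standalone Master's JSON status page (the same data as the web UI, served at /json on the UI port, 8080 by default) and count the retained completed applications. A minimal sketch — the localhost URL and the "completedapps" key reflect a default standalone setup, so verify them against your own Master's /json output:

```python
import json
from urllib.request import urlopen

def count_completed_apps(status: dict) -> int:
    # The Master's JSON status lists finished applications under
    # "completedapps" (assumed key; confirm against your /json output).
    return len(status.get("completedapps", []))

def fetch_master_status(master_ui="http://localhost:8080"):
    # Fetch the JSON view of the standalone Master UI (assumed default port).
    with urlopen(master_ui + "/json") as resp:
        return json.load(resp)

# With the limit set to 20, this should never report more than 20:
# print(count_completed_apps(fetch_master_status()))
```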
answered Mar 25 by hari
