Prevent jobs from being killed from the Web UI

0 votes
Hi. I have created a Spark application for my project, and it has a Web UI. The problem is that I am new to Spark, and somehow either I or my teammates keep killing jobs while experimenting. Is there any way I can take away the privilege to kill jobs?
Mar 6, 2019 in Apache Spark by Madhu
404 views

1 answer to this question.

0 votes

You need to be careful with this. I am not sure there is a way to take the kill privilege away from specific users, but you can disable killing of jobs and stages from the Web UI altogether. Pass the following property to spark-submit and the kill links will no longer be available in the Web UI:

./bin/spark-submit <all your existing options> --conf spark.ui.killEnabled=false
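
If you build the SparkContext yourself, the same property can be set programmatically instead. A minimal sketch in Scala, assuming a self-contained application (the app name is only for illustration):

import org.apache.spark.{SparkConf, SparkContext}

// spark.ui.killEnabled defaults to true; setting it to false removes
// the "kill" links for jobs and stages from the Web UI. This has the
// same effect as passing --conf spark.ui.killEnabled=false to spark-submit.
val conf = new SparkConf()
  .setAppName("my-app") // hypothetical name, for illustration only
  .set("spark.ui.killEnabled", "false")

val sc = new SparkContext(conf)

Note that this is an all-or-nothing switch: once set, nobody can kill jobs from the UI, including you. If you need per-user control instead, look at Spark's UI ACLs (spark.acls.enable together with spark.modify.acls), which restrict which users may perform modify operations such as killing jobs.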
answered Mar 6, 2019 by Rohit

Related Questions In Apache Spark

0 votes
1 answer

How to prevent executor from self-destructing?

I think there is a timeout set ...READ MORE

answered Mar 12, 2019 in Apache Spark by Veer
672 views
0 votes
1 answer

How to check if user has permission in Web UI?

You can implement this as follows: First, add ...READ MORE

answered Mar 14, 2019 in Apache Spark by Raj
765 views
0 votes
1 answer

How to add modify access for Web UI user?

For a user to have modification access ...READ MORE

answered Mar 14, 2019 in Apache Spark by Raj
403 views
0 votes
1 answer

Efficient way to read specific columns from a Parquet file in Spark

As parquet is a column based storage ...READ MORE

answered Apr 20, 2018 in Apache Spark by kurt_cobain
• 9,390 points
7,263 views
+1 vote
1 answer

Hadoop MapReduce word count program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
10,560 views
0 votes
1 answer

hadoop.mapred vs hadoop.mapreduce?

org.apache.hadoop.mapred is the old API; org.apache.hadoop.mapreduce is the ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
2,185 views
+2 votes
11 answers

hadoop fs -put command?

Hi, you can create a directory in HDFS ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
104,221 views
0 votes
1 answer

Copy file from local to HDFS from the Spark job in YARN mode

Refer to the below code: import org.apache.hadoop.conf.Configuration import org.apache.hadoop.fs.FileSystem import ...READ MORE

answered Jul 24, 2019 in Apache Spark by Yogi
3,393 views
+1 vote
1 answer

How to extract record from one RDD using another RDD

Hey, you can use the "contains" filter to extract ...READ MORE

answered Aug 23, 2019 in Apache Spark by Karan
2,132 views