Prevent jobs from being killed from the Web UI

0 votes
Hi. I have created a Spark application for my project and it has a Web UI. The problem is that I am new to Spark, and somehow either I or my teammates delete (kill) jobs from the UI while experimenting. Is there any way I can take away the privilege to delete jobs?
Mar 6 in Apache Spark by Madhu
17 views

1 answer to this question.

0 votes

You need to be careful with this. I am not sure about a full permission system for deleting jobs, but you can disable killing of jobs and stages from the Web UI using the spark.ui.killEnabled property. Pass it with spark-submit as shown below, and the kill option will no longer be available in the Web UI.

val sc = new SparkContext(new SparkConf())
./bin/spark-submit <all your existing options> --conf spark.ui.killEnabled=false
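If you prefer not to pass the flag on every submit, the same property can also be set programmatically on the SparkConf before the SparkContext is created, or placed in conf/spark-defaults.conf. Here is a minimal sketch, assuming a simple standalone application (the object name and structure are just placeholders):

import org.apache.spark.{SparkConf, SparkContext}

object MyApp {  // hypothetical application skeleton
  def main(args: Array[String]): Unit = {
    // Disable the "kill" links for jobs and stages in the Web UI
    val conf = new SparkConf()
      .setAppName("MyApp")  // placeholder name
      .set("spark.ui.killEnabled", "false")
    val sc = new SparkContext(conf)
    // ... your existing job logic ...
    sc.stop()
  }
}

The equivalent spark-defaults.conf entry is a single line: spark.ui.killEnabled false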
answered Mar 6 by Rohit

Related Questions In Apache Spark

0 votes
1 answer

How to prevent executor from self-destructing?

I think there is a timeout set ...READ MORE

answered Mar 12 in Apache Spark by Veer
55 views
0 votes
1 answer

How to check if user has permission in Web UI?

You can implement this as follows: First, add ...READ MORE

answered Mar 14 in Apache Spark by Raj
24 views
0 votes
1 answer

How to add modify access for Web UI user?

For a user to have modification access ...READ MORE

answered Mar 14 in Apache Spark by Raj
40 views
0 votes
1 answer

Efficient way to read specific columns from parquet file in spark

As parquet is a column based storage ...READ MORE

answered Apr 20, 2018 in Apache Spark by kurt_cobain
• 9,260 points
1,419 views
0 votes
1 answer

Hadoop Mapreduce word count Program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 10,710 points
3,344 views
0 votes
1 answer

hadoop.mapred vs hadoop.mapreduce?

org.apache.hadoop.mapred is the Old API  org.apache.hadoop.mapreduce is the ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 10,710 points
399 views
0 votes
10 answers

hadoop fs -put command?

put syntax: put <localSrc> <dest> copy syntax: copyFr ...READ MORE

answered Dec 7, 2018 in Big Data Hadoop by Aditya
16,565 views
0 votes
1 answer

Copy file from local to hdfs from the spark job in yarn mode

Refer to the below code: import org.apache.hadoop.conf.Configuration import org.apache.hadoop.fs.FileSystem import ...READ MORE

answered Jul 24 in Apache Spark by Yogi
222 views
0 votes
1 answer

How to extract record from one RDD using another RDD

Hey, you can use "contains" filter to extract ...READ MORE

answered Aug 23 in Apache Spark by Karan
56 views