Prevent jobs from being killed from the Web UI

0 votes
Hi. I have created a Spark application for my project and monitor it through its Web UI. The problem is that I am new to Spark, and while experimenting, either I or my teammates sometimes kill jobs by accident. Is there any way I can take away the privilege to kill jobs?
Mar 6 in Apache Spark by Madhu
10 views

1 answer to this question.

0 votes

You need to be careful with this. I am not sure you can restrict this for specific users, but you can disable killing of jobs and stages from the Web UI for everyone. Pass the following property to spark-submit and the kill option will no longer be available in the Web UI.

./bin/spark-submit --conf spark.ui.killEnabled=false <all your existing options>
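If you would rather not pass the flag on every submission, the same property can be set once in conf/spark-defaults.conf:

spark.ui.killEnabled false

or programmatically when building the SparkConf. A minimal sketch, assuming a Scala application (the app name is just a placeholder):

import org.apache.spark.{SparkConf, SparkContext}

// Disable the kill links in the Web UI for this application.
val conf = new SparkConf()
  .setAppName("MyApp") // placeholder name
  .set("spark.ui.killEnabled", "false")
val sc = new SparkContext(conf)

For finer-grained control, Spark's Web UI ACLs (spark.acls.enable together with spark.modify.acls) can restrict kill access to specific users instead of disabling it for everyone, but that requires setting up UI authentication first.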
answered Mar 6 by Rohit

Related Questions In Apache Spark

0 votes
1 answer

How to prevent executor from self-destructing?

I think there is a timeout set ...READ MORE

answered Mar 12 in Apache Spark by Veer
40 views
0 votes
1 answer

How to check if user has permission in Web UI?

You can implement this as follows: First, add ...READ MORE

answered Mar 14 in Apache Spark by Raj
13 views
0 votes
1 answer

How to add modify access for Web UI user?

For a user to have modification access ...READ MORE

answered Mar 14 in Apache Spark by Raj
31 views
0 votes
1 answer

Efficient way to read specific columns from a Parquet file in Spark

As parquet is a column based storage ...READ MORE

answered Apr 20, 2018 in Apache Spark by kurt_cobain
• 9,240 points
1,112 views
0 votes
1 answer

Hadoop MapReduce word count program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 10,490 points
2,328 views
0 votes
1 answer

hadoop.mapred vs hadoop.mapreduce?

org.apache.hadoop.mapred is the Old API  org.apache.hadoop.mapreduce is the ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 10,490 points
237 views
0 votes
10 answers

hadoop fs -put command?

put syntax: put <localSrc> <dest> copy syntax: copyFr ...READ MORE

answered Dec 7, 2018 in Big Data Hadoop by Aditya
11,979 views
0 votes
1 answer

Copy file from local to hdfs from the spark job in yarn mode

Refer to the below code: import org.apache.hadoop.conf.Configuration import org.apache.hadoop.fs.FileSystem import ...READ MORE

answered Jul 24 in Apache Spark by Yogi
28 views
0 votes
1 answer

How can I write a text file in HDFS not from an RDD, in Spark program?

Yes, you can go ahead and write ...READ MORE

answered May 29, 2018 in Apache Spark by Shubham
• 13,290 points
1,013 views