Password for keystore in Spark

I want to add a password to protect the private key in my keystore. The Spark application is already up and running. How can I add a password to the keystore now?
Mar 15, 2019 in Apache Spark by John

1 answer to this question.


The password for the private key in the key store is controlled by the spark.ssl.keyPassword property. SSL settings are only read when an application starts, so you will need to resubmit the running application with the new value. Pass the property at submission time using --conf:

./bin/spark-submit <all your existing options> --conf spark.ssl.keyPassword=<password>

You can also set the property on the SparkConf before creating the SparkContext, as shown in the sketch below.
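For reference, a minimal sketch of setting the same property programmatically in Scala; the application name, key store path, and password values here are placeholders, and spark.ssl.enabled, spark.ssl.keyStore, and spark.ssl.keyStorePassword are the related SSL properties you would normally configure alongside it:

import org.apache.spark.{SparkConf, SparkContext}

// Placeholder values -- replace with your key store location and passwords
val conf = new SparkConf()
  .setAppName("ssl-example")
  .set("spark.ssl.enabled", "true")                          // turn SSL on
  .set("spark.ssl.keyStore", "/path/to/keystore.jks")        // key store file
  .set("spark.ssl.keyStorePassword", "<keystore-password>")  // password for the key store
  .set("spark.ssl.keyPassword", "<private-key-password>")    // password for the private key

val sc = new SparkContext(conf)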
answered Mar 15, 2019 by karan
