How to set client authentication for SSL

0 votes
I am running a Spark application and have enabled SSL so that clients can connect. I want the clients that try to connect to be authenticated before the connection is completed. How can I do this?
Mar 15, 2019 in Apache Spark by Sanjeev
780 views

1 answer to this question.

0 votes

By default, this feature is disabled. To enable client authentication, set the spark.ssl.needClientAuth property to true. Try this:

val sc = new SparkContext(new SparkConf())

./bin/spark-submit <all your existing options> --conf spark.ssl.needClientAuth=true
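Note that client authentication only takes effect when SSL itself is enabled and a keystore/truststore are configured. As a minimal sketch, the whole setup can also be done programmatically on the SparkConf; the app name, paths, and passwords below are placeholders, not values from this answer:

import org.apache.spark.{SparkConf, SparkContext}

// Placeholder keystore/truststore locations and passwords -- substitute your own.
val conf = new SparkConf()
  .setAppName("ssl-client-auth-example")
  .set("spark.ssl.enabled", "true")                        // turn SSL on for Spark's services
  .set("spark.ssl.keyStore", "/path/to/keystore.jks")      // server-side certificate
  .set("spark.ssl.keyStorePassword", "keystore-password")
  .set("spark.ssl.trustStore", "/path/to/truststore.jks")  // certificates of clients you trust
  .set("spark.ssl.trustStorePassword", "truststore-password")
  .set("spark.ssl.needClientAuth", "true")                 // require clients to present a trusted certificate

val sc = new SparkContext(conf)

The same keys can also go in conf/spark-defaults.conf or be passed with repeated --conf flags to spark-submit, as shown above.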
answered Mar 15, 2019 by Karan

Related Questions In Apache Spark

How to set max executors for dynamic allocation? (0 votes, 1 answer)
You can set it by assigning the ...READ MORE
answered Mar 13, 2019 in Apache Spark by Venu | 1,612 views

How to enable SSL for Spark application? (0 votes, 1 answer)
You can do it dynamically like this: val ...READ MORE
answered Mar 15, 2019 in Apache Spark by Karan | 2,218 views

How to set maximum receiving rate for backpressure mechanism? (0 votes, 1 answer)
You can set the maximum receiving rate ...READ MORE
answered Mar 18, 2019 in Apache Spark by John | 411 views

Hadoop Mapreduce word count Program (+1 vote, 1 answer)
Firstly you need to understand the concept ...READ MORE
answered Mar 16, 2018 in Data Analytics by nitinrawat895 • 11,380 points | 10,618 views

hadoop.mapred vs hadoop.mapreduce? (0 votes, 1 answer)
org.apache.hadoop.mapred is the Old API; org.apache.hadoop.mapreduce is the ...READ MORE
answered Mar 16, 2018 in Data Analytics by nitinrawat895 • 11,380 points | 2,215 views

hadoop fs -put command? (+2 votes, 11 answers)
Hi, You can create one directory in HDFS ...READ MORE
answered Mar 16, 2018 in Big Data Hadoop by nitinrawat895 • 11,380 points | 104,944 views

How to set time for task speculation? (0 votes, 1 answer)
By default, the check for task speculation ...READ MORE
answered Mar 12, 2019 in Apache Spark by Veer | 377 views

How to set cpu cores for spark task? (0 votes, 1 answer)
By default, each task is allocated with ...READ MORE
answered Mar 12, 2019 in Apache Spark by Veer | 4,147 views