How to connect to Zookeeper after setting Spark recovery mode

0 votes
I am facing a problem with my Spark application. I have enabled Spark recovery and set the mode to Zookeeper, but Spark is not connecting to Zookeeper and I don't know why. How do I make the connection?
Mar 25, 2019 in Apache Spark by Snehal
1,445 views

1 answer to this question.

+1 vote

You have set Zookeeper as the recovery mode, but I think you forgot to set the Zookeeper URL to connect to. When you set Zookeeper for recovery, you also need to specify the Zookeeper URL that Spark should connect to when it performs the recovery. In your application, build the context from an empty SparkConf so it picks up properties passed at submit time:

val sc = new SparkContext(new SparkConf())

Then pass the property with --conf, replacing <URL> with your Zookeeper URL (host:port):

./bin/spark-submit <all your existing options> --conf spark.deploy.zookeeper.url=<URL>
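Note that for standalone master recovery, these properties are usually set on the master nodes themselves rather than per application, via SPARK_DAEMON_JAVA_OPTS in conf/spark-env.sh. A minimal sketch, where the Zookeeper quorum addresses and the /spark directory are example values you should replace with your own:

```shell
# conf/spark-env.sh on each standalone master (example values)
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
  -Dspark.deploy.zookeeper.url=zk1:2181,zk2:2181,zk3:2181 \
  -Dspark.deploy.zookeeper.dir=/spark"
```

After restarting the masters with this setting, the one elected leader by Zookeeper serves requests, and a standby takes over on failure.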
answered Mar 25, 2019 by Hari
