How to increase worker timeout in Spark application

Need help with increasing the worker timeout. After a minute, the master considers the workers lost. I want to increase this time to 2 minutes. How can I do this?
Mar 25, 2019 in Apache Spark by Suri

1 answer to this question.


By default, spark.worker.timeout is set to 60 seconds, which is why the master considers a worker lost after a minute without a heartbeat. To change it to 2 minutes, set the property to 120 (the value is in seconds). You can do this either in code when building the SparkContext or on the spark-submit command line:

import org.apache.spark.{SparkConf, SparkContext}

// Set the timeout (in seconds) on the SparkConf before creating the context
val sc = new SparkContext(new SparkConf().set("spark.worker.timeout", "120"))

Or, pass the property to spark-submit with --conf:

./bin/spark-submit <all your existing options> --conf spark.worker.timeout=120
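Also note that spark.worker.timeout is a standalone-cluster property that the master uses to decide when a worker is lost, so the setting ultimately has to reach the master process. A minimal sketch, assuming a standalone deployment where you can edit conf/spark-env.sh on the master node (120 mirrors the 2-minute target above):

# conf/spark-env.sh on the master node
# The standalone master picks up spark.worker.timeout (in seconds)
# from SPARK_MASTER_OPTS; restart the master for it to take effect.
export SPARK_MASTER_OPTS="-Dspark.worker.timeout=120"

After restarting the master (sbin/stop-master.sh, then sbin/start-master.sh), workers can miss heartbeats for up to two minutes before being dropped.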
answered Mar 25, 2019 by Hari
