How to set maximum receiving rate for backpressure mechanism?

0 votes
I have enabled the backpressure mechanism for my Spark application. I want to set a maximum receiving rate for the first batch of data. How can I do this?
Mar 18, 2019 in Apache Spark by Yogi
68 views

1 answer to this question.

0 votes

You can set the initial maximum receiving rate with the spark.streaming.backpressure.initialRate property. It caps the rate at which each receiver consumes data for the first batch when backpressure is enabled; after that, the backpressure algorithm adjusts the rate on its own. You can set it in your application through SparkConf:

val conf = new SparkConf().set("spark.streaming.backpressure.initialRate", "<max receiving rate>")
val sc = new SparkContext(conf)

or pass it to spark-submit with --conf:

./bin/spark-submit <all your existing options> --conf spark.streaming.backpressure.initialRate=<max receiving rate>
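For context, here is a minimal sketch of how this could look in a full streaming application. The app name, the 5-second batch interval, and the rate of 100 records per second are illustrative values, not taken from the original answer:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Enable backpressure and cap the receiving rate for the first batch only;
// after the first batch, the backpressure algorithm adjusts the rate itself.
val conf = new SparkConf()
  .setAppName("BackpressureExample")
  .set("spark.streaming.backpressure.enabled", "true")
  .set("spark.streaming.backpressure.initialRate", "100")

val ssc = new StreamingContext(conf, Seconds(5))

// ... define your input DStreams and transformations here ...

ssc.start()
ssc.awaitTermination()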
answered Mar 18, 2019 by John
