Change number of threads used by R back end

I have a Spark application running that uses an R back end to handle RPC calls. I want to increase the number of threads it uses. How can I do that?
Mar 19 in Apache Spark by Sagar

1 answer to this question.

The number of threads the R back end uses to handle RPC calls is controlled by the spark.r.numRBackendThreads property. You can set it on the SparkConf before creating the context:

val conf = new SparkConf().set("spark.r.numRBackendThreads", "<number of threads>")
val sc = new SparkContext(conf)

Or pass it to spark-submit with --conf:

./bin/spark-submit <all your existing options> --conf spark.r.numRBackendThreads=<number of threads>
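
For reference, a minimal sketch that sets the property and reads it back from the running context. The master URL, app name, and thread count 8 are placeholder values for illustration, not anything from the question above:

import org.apache.spark.{SparkConf, SparkContext}

// Placeholder values: local[2] master, app name, and thread count
// are examples only. spark.r.numRBackendThreads defaults to 2.
val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("rbackend-threads-example")
  .set("spark.r.numRBackendThreads", "8")
val sc = new SparkContext(conf)

// Read back the value the running context actually picked up.
println(sc.getConf.get("spark.r.numRBackendThreads"))

sc.stop()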
answered Mar 19 by Jai
