How to increase HDFS replication level in Spark?

0 votes
Hi guys. I need some help with the HDFS replication level in Spark. The level is currently set to the default of 3, and I want to change it to 5. Please help. Thanks
Mar 26 in Apache Spark by Raunak
54 views

1 answer to this question.

0 votes

Hi @Raunak. You can change the replication level of the files that Spark uploads to HDFS using the spark.yarn.submit.file.replication property.

You can set it in your code when creating the SparkContext. Open the Spark shell (or your application) and run:

val conf = new SparkConf().set("spark.yarn.submit.file.replication", "5")
val sc = new SparkContext(conf)

Alternatively, pass the property to spark-submit with --conf:

./bin/spark-submit <all your existing options> --conf spark.yarn.submit.file.replication=5
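Note that spark.yarn.submit.file.replication only controls the replication of the files Spark stages on HDFS when submitting a YARN application (the application jar, distributed files, and so on). If you also want the data your job writes out to HDFS to be replicated 5 times, a minimal sketch is to set the standard HDFS property dfs.replication on the job's Hadoop configuration; the app name and output path below are just placeholders:

// Assumption: output written via saveAsTextFile should also use
// replication factor 5; dfs.replication controls that for this job.
val conf = new SparkConf().setAppName("replication-demo")
val sc = new SparkContext(conf)
sc.hadoopConfiguration.set("dfs.replication", "5")
sc.parallelize(1 to 100).saveAsTextFile("hdfs:///tmp/replication-demo")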
answered Mar 26 by Yash

Related Questions In Apache Spark

0 votes
1 answer

How to increase worker timeout in Spark application?

By default, the timeout is set to ...READ MORE

answered Mar 25 in Apache Spark by Hari
205 views
0 votes
1 answer

How can I write a text file to HDFS, not from an RDD, in a Spark program?

Yes, you can go ahead and write ...READ MORE

answered May 29, 2018 in Apache Spark by Shubham
• 13,110 points
852 views
0 votes
4 answers

How to change the Spark session configuration in PySpark?

You can dynamically load properties. First create ...READ MORE

answered Dec 10, 2018 in Apache Spark by Vini
10,280 views
0 votes
1 answer

How to save and retrieve the Spark RDD from HDFS?

You can save the RDD using the saveAsObjectFile and saveAsTextFile methods. ...READ MORE

answered May 29, 2018 in Apache Spark by Shubham
• 13,110 points
1,433 views
0 votes
1 answer

Hadoop Mapreduce word count Program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 10,030 points
2,032 views
0 votes
1 answer

hadoop.mapred vs hadoop.mapreduce?

org.apache.hadoop.mapred is the old API; org.apache.hadoop.mapreduce is the ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 10,030 points
190 views
0 votes
10 answers

hadoop fs -put command?

The copy command can be used to copy files ...READ MORE

answered Dec 7, 2018 in Big Data Hadoop by Sujay
10,294 views
0 votes
1 answer

How to retain Spark jar and app jar after staging?

By default, Spark jar, app jar, and ...READ MORE

answered Mar 26 in Apache Spark by Ginni
19 views
0 votes
1 answer

Increase Yarn wait time for SparkContext

The default time that the Yarn application waits ...READ MORE

answered Mar 26 in Apache Spark by Rohit
36 views