How to increase HDFS replication level in Spark

0 votes
Hi guys. I need some help with the Spark HDFS replication level. Right now it is set to the default of 3, and I want to change it to 5. How can I do that? Thanks.
Mar 27, 2019 in Apache Spark by Raunak
1,550 views

1 answer to this question.

0 votes

Hi @Raunak. You can change the replication level through the spark.yarn.submit.file.replication property, which controls how many HDFS replicas are kept for the files Spark uploads when the application is submitted.

Set it on your SparkConf before creating the context, for example in the Spark shell:

val sc = new SparkContext(new SparkConf().set("spark.yarn.submit.file.replication", "5"))

Or pass it to spark-submit with --conf:

./bin/spark-submit <all your existing options> --conf spark.yarn.submit.file.replication=5
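
For reference, a minimal self-contained sketch that pulls this together. The app name is just illustrative, and the dfs.replication line is an extra assumption: it is the standard HDFS client property, set here on the job's Hadoop configuration in case you also want the files the job itself writes to HDFS to get 5 replicas.

import org.apache.spark.{SparkConf, SparkContext}

// Ask YARN to keep 5 HDFS replicas of the files Spark uploads at submit time
// (Spark jar, application jar, distributed cache files).
val conf = new SparkConf()
  .setAppName("replication-demo")  // illustrative name
  .set("spark.yarn.submit.file.replication", "5")

val sc = new SparkContext(conf)

// Optional (assumption): also replicate the files this job writes to HDFS 5 times
// by setting the standard HDFS client property on the job's Hadoop configuration.
sc.hadoopConfiguration.set("dfs.replication", "5")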
answered Mar 27, 2019 by Yash
