SparkContext.addFile() not able to update file.

I want to make changes to a file that I distribute with sparkContext.addFile(). I am calling the method again with the same file name as the existing file, but the file is not updating. Please help.

Mar 10 in Apache Spark by Suri

1 answer to this question.

By default, Spark won't let you overwrite a file that has already been added through sparkContext.addFile(). To enable overwriting, set the spark.files.overwrite property to true. You can set it on the SparkConf before creating the context:

val conf = new SparkConf().set("spark.files.overwrite", "true")
val sc = new SparkContext(conf)

or pass it on the command line when submitting the job:

./bin/spark-submit <all your existing options> --conf spark.files.overwrite=true
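As a minimal sketch of how this fits together (the app name and the file path /tmp/lookup.txt are hypothetical, and local[*] is used only so the snippet runs without a cluster):

```scala
import org.apache.spark.{SparkConf, SparkContext, SparkFiles}

object AddFileOverwriteDemo {
  def main(args: Array[String]): Unit = {
    // Set spark.files.overwrite before the context is created,
    // so updated copies of added files can replace stale ones
    val conf = new SparkConf()
      .setAppName("AddFileOverwriteDemo")
      .setMaster("local[*]")
      .set("spark.files.overwrite", "true")
    val sc = new SparkContext(conf)

    // Distribute the file to every node
    sc.addFile("/tmp/lookup.txt")

    // Tasks locate their local copy through SparkFiles.get,
    // not through the original driver-side path
    val localPath = SparkFiles.get("lookup.txt")
    println(localPath)

    sc.stop()
  }
}
```

Note that executors read the distributed copy via SparkFiles.get("lookup.txt"), so simply editing the original file on the driver is not enough on its own.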
answered Mar 10 by Siri
