How to change the location of Spark event logs?


Hi. By default, the event logs are stored at file:///tmp/spark-events, but I want to change this to another location. I am trying to automate something and I need all the logs to be stored in a directory I have created. So, how can I override the default?

Mar 6 in Apache Spark by Sanam

1 answer to this question.


You can change the location where the logs are stored. I wouldn't suggest changing the whole Spark configuration; instead, override the location only for the job whose logs you want stored elsewhere. You can pass the property with --conf on spark-submit:

./bin/spark-submit <all your existing options> --conf spark.eventLog.dir=<path/to/custom/location>

Or, if you are constructing the context yourself, set it on the SparkConf before creating the SparkContext:

val sc = new SparkContext(new SparkConf().set("spark.eventLog.dir", "<path/to/custom/location>"))
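As a fuller sketch, here is what a complete submission might look like. The application class, jar name, and log directory below are illustrative placeholders, not from the original question; the --conf flags themselves are standard spark-submit options:

```shell
# Submit a job with event logging enabled and pointed at a custom directory.
# These --conf flags override spark-defaults.conf for this job only.
./bin/spark-submit \
  --conf spark.eventLog.enabled=true \
  --conf spark.eventLog.dir=file:///var/log/spark-events \
  --class com.example.MyApp \
  myapp.jar
```

The directory must already exist (Spark does not create it), and the Spark History Server must be pointed at the same path via spark.history.fs.logDirectory to pick the logs up. To make the change permanent for all jobs, set spark.eventLog.dir in conf/spark-defaults.conf instead.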
answered Mar 6 by Rohit
