How to use Spark jars for Yarn distribution?

0 votes
Hello. I have an archive that contains all the jars needed for the YARN cache. I want to know how to use this archive with my application. Where should I store it, and how do I make the application use it?
Mar 28 in Apache Spark by Siri
25 views

1 answer to this question.

0 votes

First, upload the archive to HDFS and note its path. Then pass that path to the application through the spark.yarn.archive property when you submit it:

./bin/spark-submit <all your existing options> --conf spark.yarn.archive=<hdfs path to archive>

The same --conf flag also works when launching spark-shell on YARN, since the shell is submitted the same way.
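For illustration, here is a minimal sketch of the whole flow, assuming a hypothetical archive named spark-libs.zip, an HDFS directory /user/spark, and an application class com.example.MyApp (adjust all names and paths to your cluster):

# Upload the archive to HDFS and note its path (names here are illustrative)
hdfs dfs -mkdir -p /user/spark
hdfs dfs -put spark-libs.zip /user/spark/spark-libs.zip

# Submit the application, pointing spark.yarn.archive at the cached archive
./bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \
  --conf spark.yarn.archive=hdfs:///user/spark/spark-libs.zip \
  my-app.jar

With spark.yarn.archive set, YARN distributes and caches the archive on the nodes, so the Spark jars do not have to be uploaded again for every application.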
answered Mar 28 by Raj

