When running Spark on YARN, do I need to install Spark on all nodes of the YARN cluster?

0 votes
I've got a 3-node YARN cluster and I want to run Spark on YARN. Do I need to install Spark on all the nodes of the YARN cluster?
Jun 14, 2018 in Apache Spark by Shubham
• 12,890 points
857 views

1 answer to this question.

0 votes
No, it is not necessary to install Spark on all 3 nodes. Spark runs on top of YARN, and YARN takes care of launching the driver and executor containers on the cluster's nodes and shipping the Spark libraries to those containers at runtime.
So you only have to install Spark on the node from which you submit applications (the client/edge node), with HADOOP_CONF_DIR or YARN_CONF_DIR pointing at the cluster's Hadoop configuration.
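As a rough illustration (the script name and application name below are just placeholders), a small PySpark job submitted from that single node might look like this; YARN then runs the actual work on whichever nodes it allocates containers on:

```python
# example_app.py -- a minimal PySpark job (hypothetical file name).
# Submit it from the one node where Spark is installed, e.g.:
#   spark-submit --master yarn --deploy-mode cluster example_app.py
# YARN launches the driver and executors in containers on the cluster
# nodes and distributes the Spark jars to them automatically, so the
# other nodes need no local Spark installation.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("yarn-install-check").getOrCreate()

# Trivial distributed computation to confirm executors run on YARN nodes.
count = spark.sparkContext.parallelize(range(1000)).count()
print("count =", count)

spark.stop()
```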
answered Jun 14, 2018 by nitinrawat895
• 9,490 points

