Spark job using Kubernetes instead of YARN

I am writing a Spark job that uses Kubernetes instead of YARN as the cluster manager.

val spark = SparkSession.builder().appName("Demo").master(????).getOrCreate() 

What should the master part be?

Sep 6, 2018 in Kubernetes by lina

It should be in the format k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port>

In your case use it in the following way:

val spark = SparkSession.builder().appName("Demo").master("k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port>").getOrCreate()
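Note that rather than hardcoding the master in the application, the usual pattern from the Spark-on-Kubernetes documentation is to pass it at submission time with spark-submit. A minimal sketch, assuming a built Spark container image and an application jar baked into that image (the host, port, image name, and jar path below are placeholders you must fill in for your cluster):

```shell
# Submit the job in cluster mode against the Kubernetes API server.
# <k8s-apiserver-host>:<k8s-apiserver-port> is your cluster's API endpoint
# (you can find it with `kubectl cluster-info`).
bin/spark-submit \
  --master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port> \
  --deploy-mode cluster \
  --name demo \
  --class com.example.Demo \
  --conf spark.executor.instances=2 \
  --conf spark.kubernetes.container.image=<your-spark-image> \
  local:///opt/spark/jars/demo.jar
```

When the master is supplied this way, the builder in your code can simply omit `.master(...)` and pick it up from the submission environment.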
answered Sep 6, 2018 by Kalgi

