Spark job using Kubernetes instead of YARN


I am writing a Spark job that uses Kubernetes instead of YARN.

val spark = SparkSession.builder().appName("Demo").master(????).getOrCreate() 

What should the master part be?

Sep 6, 2018 in Kubernetes by lina

1 answer to this question.

It should be in the format k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port>

In your case, use it as follows (note that the master URL must be passed as a string):

val spark = SparkSession.builder().appName("Demo").master("k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port>").getOrCreate()
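As a minimal sketch, you can build the master URL from the API server's host and port before handing it to the session builder. The host name and port below are hypothetical placeholders; substitute the address of your own Kubernetes API server. The SparkSession call itself is shown in comments, since it needs spark-sql on the classpath and a reachable cluster to actually run:

```scala
object K8sMasterUrl {
  // Spark expects the "k8s://" scheme prefixed to the API server URL.
  def masterUrl(host: String, port: Int): String =
    s"k8s://https://$host:$port"

  def main(args: Array[String]): Unit = {
    // Hypothetical API server address; replace with your own.
    val master = masterUrl("kube-apiserver.example.com", 6443)
    println(master)

    // With Spark on the classpath, you would then create the session:
    // val spark = SparkSession.builder()
    //   .appName("Demo")
    //   .master(master)
    //   .getOrCreate()
  }
}
```

When submitting with spark-submit instead, the same URL goes to the --master flag.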
answered Sep 6, 2018 by Kalgi
