Docker swarm scale


I have a cluster built with docker swarm, on which we run Apache Spark apps. The cluster has one manager node and three worker nodes. I scale the number of Spark workers with the following command:

sudo docker service scale spark_worker=<number of workers>

When I increase the number of workers, I want the new containers to be created on the worker nodes only, not on the manager node.

Aug 23, 2018 in Docker by Hannah

1 answer to this question.


Try docker node update --availability=drain <nodename>. This stops the tasks currently running on that node, reschedules them elsewhere in the swarm, and prevents future tasks from being scheduled on it.
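The drain/re-activate cycle might look like this (the hostname spark-manager and the replica count are placeholders for your own setup):

```shell
# Drain the manager so the scheduler places no new tasks on it;
# existing tasks on it are stopped and rescheduled on other nodes
docker node update --availability=drain spark-manager

# Confirm the node now shows AVAILABILITY = Drain
docker node ls

# Scale as before; new spark_worker tasks now land on worker nodes only
sudo docker service scale spark_worker=5

# If you later want the manager to accept tasks again
docker node update --availability=active spark-manager
```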

But this has limitations: draining evicts every task on that node, not just this service's. If there's a better solution, please post it.
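A less drastic alternative, sketched here under the assumption that the service is named spark_worker, is a placement constraint. It restricts where this one service's tasks may run, without draining the node for everything else:

```shell
# Add a constraint to the existing service: only schedule its tasks
# on nodes whose role is "worker" (i.e. never on a manager)
docker service update --constraint-add 'node.role == worker' spark_worker

# Or declare the constraint up front when creating the service
docker service create --name spark_worker \
  --constraint 'node.role == worker' \
  <spark-worker-image>
```

With the constraint in place, docker service scale spark_worker=N only ever places new replicas on worker nodes, and other services can still use the manager.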

answered Aug 23, 2018 by Kalgi
