Docker swarm scale

0 votes

I have a cluster built with Docker Swarm in which we run Apache Spark apps. The cluster has one manager node and three worker nodes. I scale the number of Spark workers with the following command:

sudo docker service scale spark_worker=<number of workers>

When I increase the number of workers, I want the new containers to be created on the worker nodes, not on the manager node.

Aug 23, 2018 in Docker by Hannah
• 16,710 points
85 views

1 answer to this question.

0 votes

Try docker node update --availability=drain <nodename>. This stops the containers running on that node, reschedules them elsewhere in the swarm, and prevents future containers from being scheduled on that node.

But this approach has a lot of limitations. If there's a better solution, please post it.
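A minimal sketch of the drain approach above, assuming the manager's hostname is manager1 (a placeholder; check yours with docker node ls):

```shell
# Stop scheduling new tasks on the manager and migrate its current tasks
docker node update --availability=drain manager1

# Scale the worker service; new tasks now land only on active nodes
sudo docker service scale spark_worker=5

# Later, to let the manager accept tasks again:
docker node update --availability=active manager1
```

A less disruptive alternative, if it fits your setup, is a placement constraint on the service itself, e.g. docker service update --constraint-add 'node.role == worker' spark_worker, which keeps the manager schedulable for other services while pinning spark_worker tasks to worker nodes.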

answered Aug 23, 2018 by Kalgi
• 42,510 points
