Created cluster on AWS but master node showing not ready

I created a cluster on AWS, but the master node is showing NotReady. Could you please suggest a fix?

root@kubemaster:~# kubectl get nodes
NAME              STATUS     ROLES    AGE   VERSION
ip-172-31-11-68   NotReady   master   12h   v1.12.2
ip-172-31-4-65    Ready      <none>   11h   v1.12.2
Dec 20, 2018 in Kubernetes by Ali

1 answer to this question.

This error usually occurs when your cluster hasn't been created properly. Check the following and let me know:

  • Check whether your cluster was created successfully
  • Check whether your node has joined the cluster
  • If the cluster was created successfully, check whether all your pods are running.
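The checks above can be run from the command line. A sketch, assuming a kubeadm-based setup; the node name is taken from the question's output, and the join parameters are placeholders:

```shell
# On the master: see why the node is NotReady
# (look at the Conditions and Events sections)
kubectl describe node ip-172-31-11-68

# Verify the control-plane and system pods are running
kubectl get pods -n kube-system

# If the cluster was never initialised, set it up on the master
sudo kubeadm init

# On the worker: join using the command that kubeadm init prints
# (<master-ip>, <token>, and <hash> are placeholders)
sudo kubeadm join <master-ip>:6443 --token <token> \
    --discovery-token-ca-cert-hash sha256:<hash>
```

Note that a node can also stay NotReady when no pod network add-on has been installed, so it is worth checking `kubectl describe node` before re-initialising.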
answered Dec 20, 2018 by Eric
Hey @Eric, thanks man, my cluster hadn't been created. I ran the kubeadm init command again and joined the node to the cluster. Now it's working fine.
