Kops validate cluster giving an error saying no such host

I've created a Kubernetes cluster using kops on AWS. I can see that my master and worker nodes have been created, but when I run kops validate cluster, I get an error saying "no such host".

What is the issue here?
Jan 8, 2019 in Kubernetes by Ali

1 answer to this question.


One possible reason is that you're using a private DNS zone. If you created a private hosted zone in Route 53, its records only resolve inside the VPC, so your machine (which is outside the VPC) cannot resolve the API server's hostname — hence the "no such host" error from kops validate cluster.
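The "no such host" message is really a DNS resolution failure. A minimal sketch to confirm this from your local machine (the hostname below is a placeholder — substitute your cluster's actual API endpoint):

```python
import socket

def resolves(hostname: str) -> bool:
    """Return True if this machine can resolve the hostname to an IP address."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        # Name resolution failed -- same root cause as kops' "no such host"
        return False

# Placeholder: replace with your cluster's API endpoint
print(resolves("api.mycluster.example.com"))
```

If this prints False on your workstation but True from a host inside the VPC, the private hosted zone is the culprit.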

You can SSH into the master and run the following commands there:

ssh -i .ssh/id_rsa admin@ipv4-public-ip-of-master
kubectl get nodes
answered Jan 8, 2019 by Manali

Hi, I'm facing the same issue. When I try to SSH into the master node, I get an error:

ssh: Could not resolve hostname ********: Name or service not known

Hi@nishad,

By default, SSH listens on port 22, so make sure port 22 is open in the security group attached to your master node.
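Before blaming the security group, it helps to distinguish a DNS failure ("Could not resolve hostname") from a blocked port. A small sketch to test TCP reachability of port 22 (the IP below is a placeholder from the TEST-NET range):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and unresolvable hosts
        return False

# Placeholder IP: replace with your master node's public IPv4 address
print(port_open("203.0.113.10", 22))
```

Note that "Could not resolve hostname" happens before any port is contacted: it means you passed a DNS name that doesn't resolve, so SSH to the master's public IP address directly instead.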
