Error creating VPC when trying to create cluster on AWS

+1 vote

I'm trying to create a cluster on AWS and it's successfully created, but when I try to update it using the command "kops update cluster", it throws an error saying:

Error creating VPC

and it also says that no progress was made.
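
For context, the sequence of commands that leads to this error is roughly the following (a sketch only; the cluster name, state store bucket and zone are placeholders):

    kops create cluster \
      --name=k8s.example.com \
      --state=s3://my-kops-state-store \
      --zones=us-east-1a \
      --node-count=2

    # Applying the configuration with --yes is the step that actually creates
    # the AWS resources (including the VPC), so this is where the error appears:
    kops update cluster k8s.example.com --yes --state=s3://my-kops-state-store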

Oct 24, 2018 in Kubernetes by Hannah
• 18,520 points

edited Oct 24, 2018 by Hannah
Hey @Hannah, can you post the error log?
Hey @Kalgi, I've updated my question with the error log.
Can you check how many VPCs have been created in that particular region?
There are 5 in the region I'm working in.
Ahh, now I get what the problem is. By default you're only allowed 5 VPCs per region, and kops won't be able to create another one until you delete one of them. Try deleting a VPC that you're not using and then create the cluster again.
I'll try deleting the VPC that's not in use and let you know if it works.
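
A quick way to check that count from the command line (a rough sketch; assumes the AWS CLI is configured, and the region name is a placeholder):

    aws ec2 describe-vpcs --region us-east-1 --query 'Vpcs[].VpcId' --output text | wc -w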

1 answer to this question.

0 votes
By default, AWS only allows 5 VPCs per region, and you already have 5, so it won't be able to create another one. Either delete an existing VPC that's not in use or switch to a different region. Let me know if it works :)
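
A minimal sketch of the two options (the VPC ID, region, cluster name and state store bucket below are placeholders):

    # Option 1: delete an unused VPC. Its dependent resources (subnets,
    # internet gateways, etc.) have to be removed first, otherwise the call fails.
    aws ec2 delete-vpc --vpc-id vpc-0123456789abcdef0 --region us-east-1

    # Option 2: create the cluster in a different region instead.
    kops create cluster \
      --name=k8s.example.com \
      --state=s3://my-kops-state-store \
      --zones=eu-west-1a

Note that 5 VPCs per region is a soft limit, so you can also request an increase from AWS if you need to keep all the existing VPCs.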
answered Oct 24, 2018 by Kalgi
• 52,350 points

Related Questions In Kubernetes

0 votes
1 answer

Trying to create Kubernetes cluster inside existing VPC in AWS

You can add this environment variable: export VPC_ID=vpc-YOURID (see the sketch below)

answered Oct 17, 2018 in Kubernetes by Kalgi
• 52,350 points
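A minimal sketch of how that variable is typically combined with kops to reuse an existing VPC (the VPC ID, CIDR, cluster name and state store bucket are placeholders, not taken from the linked question):

    export VPC_ID=vpc-0123456789abcdef0
    kops create cluster \
      --name=k8s.example.com \
      --state=s3://my-kops-state-store \
      --zones=us-east-1a \
      --vpc=${VPC_ID} \
      --network-cidr=10.0.0.0/16
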
0 votes
1 answer

Permissions related to AWS ECR

If you add allowContainerRegistry: true, kops will add those permissions ...

answered Oct 9, 2018 in Kubernetes by Kalgi
• 52,350 points