How to import big public data sets into AWS


I was trying to use the sample data sets provided by AWS. The problem is that they are very large and I don't really need files that big; downloading them takes a lot of resources and bandwidth. Is there a better way to import this data into AWS for use?

Jan 8, 2019 in AWS by datageek

1 answer to this question.


One solution is to create an EBS volume from the Snapshot ID of the public dataset and attach it to your EC2 instance. That way you don't pay for the data transfer. Some data sets are only available in certain regions, so make sure you launch your EC2 instance in the same region as the snapshot.
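If you want to script this, here is a minimal sketch using boto3 (the AWS SDK for Python). The snapshot ID, instance ID, region, and Availability Zone below are placeholders; substitute the values for the public dataset you want and for the instance you have running in that region.

```python
# Minimal sketch: create an EBS volume from a public dataset snapshot
# and attach it to an EC2 instance. All IDs below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # use the dataset's region

# Create an EBS volume from the public dataset snapshot
volume = ec2.create_volume(
    SnapshotId="snap-0123456789abcdef0",   # placeholder snapshot ID
    AvailabilityZone="us-east-1a",         # must match your instance's AZ
    VolumeType="gp3",
)
volume_id = volume["VolumeId"]

# Wait until the volume is ready, then attach it to your instance
ec2.get_waiter("volume_available").wait(VolumeIds=[volume_id])
ec2.attach_volume(
    VolumeId=volume_id,
    InstanceId="i-0123456789abcdef0",      # placeholder instance ID
    Device="/dev/sdf",
)
```

After attaching, mount the device from inside the instance (for example with `mount /dev/xvdf /data`) and the dataset is available locally without any download over the internet.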

answered Jan 8, 2019 by Archana
