How to move data from Oracle database to Hadoop?

0 votes
I have around 50 GB of data, and my requirement is to import that data from an Oracle database into the Hadoop Distributed File System (HDFS) so that I can do further processing on it.

So, can anyone tell me if there is a way to load data from Oracle DB to HDFS?

I heard about a tool called Sqoop, but I don't have any idea how to use it.

Please help!

Thanks in advance.
Apr 11, 2018 in Big Data Hadoop by Shubham
• 13,110 points
1,473 views

1 answer to this question.

0 votes

Yes, you heard it correctly.

Apache Sqoop is one of the tools used for loading data from Oracle DB to HDFS. But Sqoop is not limited to Oracle DB; you can use it to import and export data to and from most relational databases.

Let me tell you first about Sqoop.

Apache Sqoop is a tool designed for efficiently transferring bulk data between Apache Hadoop and external data stores such as relational databases and enterprise data warehouses.

Sqoop is used to import data from external datastores into the Hadoop Distributed File System or related Hadoop ecosystems like Hive and HBase. Similarly, Sqoop can also be used to extract data from Hadoop or its ecosystems and export it to external datastores such as relational databases and enterprise data warehouses. Sqoop works with relational databases such as Teradata, Netezza, Oracle, MySQL, Postgres, etc.
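Just to give you a rough idea of what this looks like in practice, a basic Oracle-to-HDFS import with Sqoop is a single command along these lines. Note this is only a sketch: the host, port, SID, username, password file, table name, and target directory below are all placeholder values you would replace with your own, and it assumes the Oracle JDBC driver jar has already been copied into Sqoop's lib directory.

```shell
# Import one Oracle table into HDFS as files under --target-dir.
# All connection details below are placeholders, not real values.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost.example.com:1521/ORCL \
  --username SCOTT \
  --password-file /user/hadoop/.oracle_password \
  --table EMPLOYEES \
  --target-dir /user/hadoop/employees \
  --num-mappers 4
```

Sqoop runs the import as parallel map tasks (4 here, via --num-mappers). If the table has no primary key, you also need to pass --split-by with a column name so Sqoop knows how to partition the rows across those tasks.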

You can refer to this link: https://www.edureka.co/blog/hdfs-using-sqoop/

Follow the steps mentioned in the blog and you will get to know how Sqoop is used to transfer data from Oracle DB to HDFS.

I hope this will answer your question to some extent.

answered Apr 11, 2018 by nitinrawat895
• 10,030 points
