Point data from Hive to Hbase

Can someone mention the steps to point data from Hive to Hbase?
Feb 14 in Big Data Hadoop by Karan

1 answer to this question.

Refer to the below steps to transfer data from Hive to HBase:

Create the HBase Table:

create 'employee','personaldetails','deptdetails'

The above statement creates the 'employee' table with two column families: 'personaldetails' and 'deptdetails'.
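To confirm the table and its column families were created as expected, you can run describe in the HBase shell (the full output also lists the default column-family attributes, omitted here):

```
hbase(main):002:0> describe 'employee'
Table employee is ENABLED
employee
COLUMN FAMILIES DESCRIPTION
{NAME => 'deptdetails', ...}
{NAME => 'personaldetails', ...}
```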

Insert the data into HBase table:

hbase(main):049:0> put 'employee','eid01','personaldetails:fname','Brundesh'

0 row(s) in 0.1030 seconds

hbase(main):050:0> put 'employee','eid01','personaldetails:Lname','R'

0 row(s) in 0.0160 seconds

hbase(main):051:0> put 'employee','eid01','personaldetails:salary','10000'

0 row(s) in 0.0090 seconds

hbase(main):060:0> put 'employee','eid01','deptdetails:name','R&D'

0 row(s) in 0.0680 seconds

hbase(main):061:0> put 'employee','eid01','deptdetails:location','Banglore'

0 row(s) in 0.0140 seconds

hbase(main):067:0>  put 'employee','eid02','personaldetails:fname','Abhay'

0 row(s) in 0.0080 seconds

hbase(main):068:0>  put 'employee','eid02','personaldetails:Lname','Kumar'

0 row(s) in 0.0080 seconds

hbase(main):069:0>  put 'employee','eid02','personaldetails:salary','100000'

0 row(s) in 0.0090 seconds
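Before wiring up Hive, it is worth verifying the inserts with a scan. A scan of the table should show one row per row key, with each cell listed under its column family (output shape is illustrative):

```
hbase(main):070:0> scan 'employee'
ROW     COLUMN+CELL
 eid01  column=deptdetails:location, value=Banglore
 eid01  column=deptdetails:name, value=R&D
 eid01  column=personaldetails:Lname, value=R
 eid01  column=personaldetails:fname, value=Brundesh
 eid01  column=personaldetails:salary, value=10000
 eid02  column=personaldetails:Lname, value=Kumar
 eid02  column=personaldetails:fname, value=Abhay
 eid02  column=personaldetails:salary, value=100000
```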

Now create a Hive table pointing to the HBase table.

If there are multiple column families in HBase, we can create one Hive table for each column family. In this case, we have 2 column families, so we create two tables, one for each column family.

Table for personal details column family:

create external table employee_hbase(Eid String, f_name string, s_name string, salary int)

STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'

 with serdeproperties ("hbase.columns.mapping"=":key,personaldetails:fname,personaldetails:Lname,personaldetails:salary")

 tblproperties("hbase.table.name"="employee");

If we are creating a non-native Hive table using a storage handler, we must specify the STORED BY clause:

STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
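The statement above covers only the personaldetails column family. A corresponding table for the deptdetails column family would follow the same pattern; the table and column names below are illustrative, but the mapping mirrors the columns inserted earlier:

```
create external table employee_hbase_dept(eid string, dept_name string, location string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
with serdeproperties ("hbase.columns.mapping"=":key,deptdetails:name,deptdetails:location")
tblproperties("hbase.table.name"="employee");
```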

Note: different storage handler classes exist for different databases; HBaseStorageHandler is specific to HBase.
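Once the mapping table exists, the HBase rows can be read from Hive like any ordinary table. For example, to pull the personal details inserted above:

```
select eid, f_name, s_name, salary from employee_hbase;
```

Each HBase row key becomes the eid value, and the mapped qualifiers fill the remaining columns.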

answered Feb 14 by Omkar
• 65,810 points

