Unable to load data into HBase table from dataset

+1 vote

I am not able to load data into an HBase table.
Steps:
1- create_namespace 'user_ns'
2- create 'anuj_ns:sales', 'order'
3- load dataset in local 

 curl -o user/DataSet/sales.csv https://raw.githubusercontent.com/bsullins/data/master/salesOrders.csv 

4- remove header row 

sed -i '1d' sales.csv

5- copy data to hdfs

 hdfs dfs -copyFromLocal sales.csv user/DataSet 

6- load data in hbase table 'sales'
 

hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
  -Dimporttsv.separator=, \
  '-Dimporttsv.columns=HBASE_ROW_KEY,order:orderID,order:orderDate,order:shipDate,order:shipMode,order:profit,order:quantity,order:sales' \
  user_ns:sales user/DataSet/sales.csv
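Two things worth checking before rerunning: the namespace must match the one the table was actually created in (step 1 creates `user_ns` but step 2 creates the table as `anuj_ns:sales`, while the import targets `user_ns:sales`), and `-Dimporttsv.columns` must be passed as a single comma-separated token with no spaces or line breaks inside it. A minimal sketch of the whitespace check, using the column list from the command above:

```shell
# The column spec ImportTsv expects: one token, comma-separated, no whitespace.
COLUMNS='HBASE_ROW_KEY,order:orderID,order:orderDate,order:shipDate,order:shipMode,order:profit,order:quantity,order:sales'

# Reject the spec if it contains any spaces, tabs, or newlines
# before handing it to ImportTsv.
case "$COLUMNS" in
  *[[:space:]]*) echo "invalid column spec" ;;
  *)             echo "column spec ok" ;;
esac
```

If the spec was split across lines in a script, the embedded newlines and spaces become part of the column names and ImportTsv will fail to map them.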

But the last command does not complete successfully and throws an error.
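As a side check on step 4: stripping the header with sed is the right approach, since ImportTsv does not skip header rows itself. A small sketch on a throwaway sample file (hypothetical data standing in for sales.csv):

```shell
# Hypothetical sample file standing in for sales.csv.
printf 'orderID,orderDate,profit\n1,2018-11-01,10.5\n2,2018-11-02,3.2\n' > /tmp/sales_sample.csv

# Same in-place header strip as step 4 (GNU sed syntax).
sed -i '1d' /tmp/sales_sample.csv

# First remaining line is now a data row, not the header.
head -n 1 /tmp/sales_sample.csv   # → 1,2018-11-01,10.5
```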
Dec 18, 2018 in Big Data Hadoop by slayer
• 29,040 points

1 answer to this question.

0 votes

Try these steps (make necessary changes):

First, upload the dataset file to HDFS:

hdfs dfs -put custs   --->ENTER

Next, create a new table in HBase.

Open the HBase shell:

hbase shell --->ENTER

Now create a table. You can use your own name, but don't forget to change the table name in the bulk-load commands below accordingly:

create 'customerNew','info'   --->ENTER
quit    --->ENTER

Now, execute the following command. Please make sure you provide your own table name:

HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar /opt/cloudera/parcels/CDH/lib/hbase/hbase-server-1.2.0-cdh5.11.1.jar importtsv -Dimporttsv.separator=, -Dimporttsv.bulk.output=output -Dimporttsv.columns=HBASE_ROW_KEY,info:id,info:fname,info:lname,info:age,info:prof customerNew custs  --->ENTER

Next, execute the following to complete the bulk load, again with your own table name:

HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar /opt/cloudera/parcels/CDH/lib/hbase/hbase-server-1.2.0-cdh5.11.1.jar completebulkload output customerNew  --->ENTER

Now, open the HBase shell again:

hbase shell  --->ENTER

Then run the command below, with your table's name, and you should see the data loaded into the table:

scan 'customerNew'  --->ENTER
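If the scan output is too long to eyeball, a quick way to confirm the load (still inside the HBase shell, using the same example table name) is:

```
count 'customerNew'   --->ENTER      (row count should match the lines in custs)
get 'customerNew', '<row-key>'   --->ENTER      (fetch one row by its key; substitute a real key)
```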

answered Dec 18, 2018 by Omkar
• 65,840 points


© 2018 Brain4ce Education Solutions Pvt. Ltd. All rights Reserved.
"PMP®","PMI®", "PMI-ACP®" and "PMBOK®" are registered marks of the Project Management Institute, Inc. MongoDB®, Mongo and the leaf logo are the registered trademarks of MongoDB, Inc.