Unable to load data into HBase table from dataset


I am not able to load data into an HBase table.
Steps:
1- create_namespace 'user_ns'
2- create 'anuj_ns:sales', 'order'
3- load dataset in local 

 curl -o user/DataSet/sales.csv https://raw.githubusercontent.com/bsullins/data/master/salesOrders.csv 

4- remove header row 

sed -i '1d' sales.csv

5- copy data to hdfs

 hdfs dfs -copyFromLocal sales.csv user/DataSet 

6- load data into the HBase table 'sales'
 

hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.separator=, -Dimporttsv.columns=' 
HBASE_ROW_KEY, 
order:orderID, 
order:orderDate, 
order:shipDate, 
order:shipMode, 
order:profit, 
order:quantity, 
order:sales' user_ns:sales user/DataSet/sales.csv

But the last command does not complete successfully and throws an error.
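For reference, two things in the command above are likely to trip up ImportTsv (these are assumptions, not confirmed by the error output): the value of -Dimporttsv.columns must be a single comma-separated token with no spaces or line breaks, and the target table must match the namespace the table was actually created in (the steps above create 'anuj_ns:sales' but import into 'user_ns:sales'). A corrected sketch of the command, assuming the table really exists as 'user_ns:sales', might look like:

```shell
# Column list as one token: no whitespace between the comma-separated entries,
# or ImportTsv fails to parse the column-family mappings.
COLUMNS='HBASE_ROW_KEY,order:orderID,order:orderDate,order:shipDate,order:shipMode,order:profit,order:quantity,order:sales'

# Guarded so the sketch is a no-op on machines without an HBase install.
if command -v hbase >/dev/null; then
  hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
    -Dimporttsv.separator=, \
    -Dimporttsv.columns="$COLUMNS" \
    user_ns:sales user/DataSet/sales.csv
fi
```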
Dec 18, 2018 in Big Data Hadoop by slayer

1 answer to this question.


Try these steps (make necessary changes):

First, upload the dataset file to HDFS:

hdfs dfs -put custs   --->ENTER

Next, create a new table in HBase. Open the HBase shell:

hbase shell --->ENTER

Now create a table. You can use a name of your own, but don't forget to change the table name in the HBase bulk-load commands below to match:

create 'customerNew','info'   --->ENTER
quit    --->ENTER
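If you prefer not to open an interactive shell, the same DDL can be piped into the HBase shell from stdin (a sketch, assuming the table name 'customerNew' from the step above):

```shell
# Build the DDL as a string, then pipe it into the non-interactive shell.
TABLE="customerNew"
DDL="create '$TABLE','info'"

# Guarded so the sketch is a no-op on machines without an HBase install.
if command -v hbase >/dev/null; then
  echo "$DDL" | hbase shell
fi
```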

Now, execute the following command, making sure you provide your own table's name:

HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar /opt/cloudera/parcels/CDH/lib/hbase/hbase-server-1.2.0-cdh5.11.1.jar importtsv -Dimporttsv.separator=, -Dimporttsv.bulk.output=output -Dimporttsv.columns=HBASE_ROW_KEY,info:id,info:fname,info:lname,info:age,info:prof customerNew custs  --->ENTER
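Before running completebulkload, it can help to confirm that the importtsv step actually wrote HFiles under the directory named by -Dimporttsv.bulk.output (here: 'output'):

```shell
# Listing the bulk-output directory should show one subdirectory per column
# family (here 'info') containing the generated HFiles.
OUTDIR="output"

# Guarded so the sketch is a no-op on machines without an HDFS client.
if command -v hdfs >/dev/null; then
  hdfs dfs -ls "$OUTDIR"
fi
```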

Next, run completebulkload, again using your own table's name:

HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar /opt/cloudera/parcels/CDH/lib/hbase/hbase-server-1.2.0-cdh5.11.1.jar completebulkload output customerNew  --->ENTER

Now, open the HBase shell again:

hbase shell  --->ENTER

and run the command below (with your own table's name); you should see the data loaded into the table:

scan 'customerNew'  --->ENTER
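If the table is large, scan output can be unwieldy; a quick row-count check via the shell's count command is an alternative (a sketch, using the hypothetical 'customerNew' table name from above):

```shell
# 'count' scans the table and prints the number of rows, which is a cheaper
# sanity check than dumping every cell with 'scan'.
TABLE="customerNew"

# Guarded so the sketch is a no-op on machines without an HBase install.
if command -v hbase >/dev/null; then
  echo "count '$TABLE'" | hbase shell
fi
```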

answered Dec 18, 2018 by Omkar
