Sqoop import error - Invalid arguments


$ sqoop import \
> --connect jdbc:mysql://dbserver.edu.cloudlab.com:3306/labuser_database \
> --username edu_labuser \
> --password edureka -m 1  \
> --table dim_store \
> --columns store_nbr,geo_region_cd,store_nm,region_nbr,market_nm,city_nm \
> --where "geo_region_cd = 'US' AND op_cmpny_cd = 'WMT-US'" \
> --hcatalog-database edureka_788309_dw \
> --hcatalog-table dim_store \
> --create-hcatalog-table \
> --hive-partition-keys op_cmpny_cd \
> --hive-partition-values 'WMT-US' \
> --hcatalog-storage-stanza "stored as orcfile";
Warning: /opt/cloudera/parcels/CDH-5.11.1-1.cdh5.11.1.p0.4/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
20/08/31 07:20:55 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.11.1
20/08/31 07:20:55 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
20/08/31 07:20:55 ERROR tool.BaseSqoopTool: Error parsing arguments for import:
20/08/31 07:20:55 ERROR tool.BaseSqoopTool: Unrecognized argument: --hive-partition-keys
20/08/31 07:20:55 ERROR tool.BaseSqoopTool: Unrecognized argument: op_cmpny_cd
20/08/31 07:20:55 ERROR tool.BaseSqoopTool: Unrecognized argument: --hive-partition-values
20/08/31 07:20:55 ERROR tool.BaseSqoopTool: Unrecognized argument: WMT-US
20/08/31 07:20:55 ERROR tool.BaseSqoopTool: Unrecognized argument: --hcatalog-storage-stanza
20/08/31 07:20:55 ERROR tool.BaseSqoopTool: Unrecognized argument: stored as orcfile
Try --help for usage instructions.

Aug 31, 2020 in Big Data Hadoop by Vikramraj

1 answer to this question.


Hi@Vikramraj,

You have passed some wrong arguments in your command. For example, you passed --hive-partition-keys, but the correct option is --hive-partition-key (singular); likewise, --hive-partition-values should be --hive-partition-value. Once Sqoop fails to parse those, it reports the remaining arguments (--hcatalog-storage-stanza and its value) as unrecognized as well, even though that option itself is valid. You can check the document below for the correct arguments.
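With those two options renamed, your command would look something like this (an untested sketch against your cluster; I have also swapped the plain-text --password for -P, as the warning in your log suggests):

```shell
sqoop import \
  --connect jdbc:mysql://dbserver.edu.cloudlab.com:3306/labuser_database \
  --username edu_labuser \
  -P \
  -m 1 \
  --table dim_store \
  --columns store_nbr,geo_region_cd,store_nm,region_nbr,market_nm,city_nm \
  --where "geo_region_cd = 'US' AND op_cmpny_cd = 'WMT-US'" \
  --hcatalog-database edureka_788309_dw \
  --hcatalog-table dim_store \
  --create-hcatalog-table \
  --hive-partition-key op_cmpny_cd \
  --hive-partition-value 'WMT-US' \
  --hcatalog-storage-stanza "stored as orcfile"
```

Note that --hive-partition-key/--hive-partition-value take a single key and value; everything else in your command can stay as it is.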

https://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html

answered Sep 1, 2020 by MD
