Which version of Sqoop should I install for Hadoop 3.1.1?

+1 vote
Feb 7, 2019 in Big Data Hadoop by lucky
• 130 points
5,688 views
Bro, where did you get 3.1.1? I downloaded it from the official site but it did not work. Please share the link.
Hi @vishal,

You can download any version of Hadoop from the official site.
Does Sqoop 1.4.7 work with Hadoop 3.3.0?

Hi @shresht,

I am not sure whether the combination you asked about will work; you would have to set it up on your system and check. But you can use Sqoop 1.4.7 with Hadoop 2.6.0, as you can verify from the official link below.
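To do that manual check, a minimal smoke test would look something like this (my own sketch, not from the thread; the connect string and credentials are placeholders, and it assumes a MySQL JDBC driver jar is already in $SQOOP_HOME/lib):

sqoop version
sqoop list-databases --connect jdbc:mysql://localhost/ --username root -P
# If both commands finish without classpath errors, the Sqoop/Hadoop pair on this machine is usable.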

 

It is not working. When I check the Sqoop version, it gives the error: "path not specified".

I followed this link for installing Sqoop on Windows: https://www.edureka.co/community/39180/need-help-installing-sqoop-on-windows

Hi @shresht,

Did you configure Hadoop on your system? Sqoop will only work if Hadoop is installed and configured.
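A quick way to confirm that before blaming Sqoop (my sketch; shown as Linux shell commands, but the same checks apply on Windows with %HADOOP_HOME%):

hadoop version        # should print the Hadoop build you installed
hdfs dfs -ls /        # should list HDFS without errors once the daemons are running
echo $HADOOP_HOME     # Sqoop's bin/configure-sqoop locates Hadoop through environment variables like this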

1 answer to this question.

+1 vote
Hi @lucky!

The latest version of Sqoop available is 1.99.7, but it is not stable. For Hadoop 3.1.1, I suggest you install Sqoop 1.4.7, which is a stable release.
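For reference, a typical Linux setup of Sqoop 1.4.7 alongside Hadoop 3.1.1 looks roughly like this (my own sketch, not part of the original answer; every path and the MySQL connector jar name are examples to adapt to your system):

# Get the stable 1.4.7 binary build from the Apache archive
wget https://archive.apache.org/dist/sqoop/1.4.7/sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz
sudo mkdir -p /usr/local/sqoop
sudo tar -xzf sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz -C /usr/local/sqoop

# Point Sqoop at the existing Hadoop 3.1.1 install (example paths)
export HADOOP_HOME=/usr/local/hadoop-3.1.1
export SQOOP_HOME=/usr/local/sqoop/sqoop-1.4.7.bin__hadoop-2.6.0
export PATH=$PATH:$SQOOP_HOME/bin

# Copy in a MySQL JDBC driver for database imports (jar name is an example)
cp mysql-connector-java-5.1.47.jar $SQOOP_HOME/lib/

# Sanity check
sqoop version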
answered Feb 7, 2019 by Omkar
• 69,220 points
Thanks @omkar!

Does Sqoop 1.4.7 work with Hadoop 3.1.1? I installed it but I get errors.
Hey @lucky, could you post the error here? It'll be easier for me to help you.
My HBase version is 2.0.4 and Sqoop is 1.4.7. Here is the command and the output:

sqoop import --connect jdbc:mysql://localhost/sqoopdb --username huser --password hive --table customer --hbase-table hcust1 --column-family cf1
Warning: /usr/local/sqoop/sqoop-1.4.7/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop/sqoop-1.4.7/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
/usr/local/hadoop-3.1.1/libexec/hadoop-functions.sh: line 2358: HADOOP_ORG.APACHE.SQOOP.SQOOP_USER: bad substitution
/usr/local/hadoop-3.1.1/libexec/hadoop-functions.sh: line 2453: HADOOP_ORG.APACHE.SQOOP.SQOOP_OPTS: bad substitution
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop-3.1.1/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hbase/hbase-2.0.4/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2019-03-01 06:35:48,802 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
2019-03-01 06:35:49,009 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
2019-03-01 06:35:49,784 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
2019-03-01 06:35:49,785 INFO tool.CodeGenTool: Beginning code generation
Fri Mar 01 06:35:51 IST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
2019-03-01 06:35:52,139 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `customer` AS t LIMIT 1
2019-03-01 06:35:52,519 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `customer` AS t LIMIT 1
2019-03-01 06:35:52,572 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop-3.1.1
Note: /tmp/sqoop-lx5/compile/92ec9728bba8ab54aea4d38e4d12b55c/customer.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
2019-03-01 06:35:59,037 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-lx5/compile/92ec9728bba8ab54aea4d38e4d12b55c/customer.jar
2019-03-01 06:35:59,615 WARN manager.MySQLManager: It looks like you are importing from mysql.
2019-03-01 06:35:59,615 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
2019-03-01 06:35:59,615 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
2019-03-01 06:35:59,616 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
2019-03-01 06:36:00,159 INFO mapreduce.ImportJobBase: Beginning import of customer
2019-03-01 06:36:00,167 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
2019-03-01 06:36:00,998 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
Fri Mar 01 06:36:01 IST 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
2019-03-01 06:36:01,217 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.HBaseAdmin.<init>(Lorg/apache/hadoop/conf/Configuration;)V
    at org.apache.sqoop.mapreduce.HBaseImportJob.jobSetup(HBaseImportJob.java:163)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:268)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:520)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

You should use the same version of HBase for both compiling and running the jar; I think the mismatch is there. Sqoop 1.4.7 was compiled against the pre-2.0 HBase client API, and HBase 2.x removed the HBaseAdmin(Configuration) constructor that Sqoop's HBaseImportJob calls, which is exactly the NoSuchMethodError in your log.
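One way to confirm it is purely the HBase client mismatch (my own sketch, not from the thread; the --target-dir path is just an example): run the same import without the HBase options. If the data lands in HDFS cleanly, the MySQL side of your setup is fine.

sqoop import \
  --connect jdbc:mysql://localhost/sqoopdb \
  --username huser -P \
  --table customer \
  --target-dir /user/lx5/customer_check

For the direct --hbase-table import you would need an HBase client line that still ships the old HBaseAdmin constructor (HBase 1.x), or a Sqoop build recompiled against the HBase 2.x API.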
