How to use Sqoop in a Java Program

0 votes
I am a fresher in Big Data Hadoop. Can someone help me understand how to use Sqoop in a Java program?
Sep 4 in Big Data Hadoop by nitinrawat895
• 10,730 points
80 views

1 answer to this question.

0 votes

There is a trick that worked out pretty well for me: over SSH, you can execute the Sqoop command directly. All you need is an SSH Java library.

The actual data transfer is independent of Java. You just need to include any SSH library in your project and have Sqoop installed on the remote system where you want to perform the import. Then connect to that system via SSH and execute the command that imports the data from MySQL into Hive.

Follow these steps:

Download the sshxcute Java library from https://code.google.com/p/sshxcute/ and add it to the build path of the Java project that contains the following code:

import net.neoremind.sshxcute.core.SSHExec;
import net.neoremind.sshxcute.core.ConnBean;
import net.neoremind.sshxcute.task.CustomTask;
import net.neoremind.sshxcute.task.impl.ExecCommand;

public class TestSSH {

    public static void main(String[] args) throws Exception {

        // Initialize a ConnBean object; the parameters are IP, username, password
        ConnBean cb = new ConnBean("192.168.56.102", "root", "hadoop");

        // Pass the ConnBean instance to the static SSHExec.getInstance(ConnBean)
        // method to retrieve a singleton SSHExec instance
        SSHExec ssh = SSHExec.getInstance(cb);

        // Connect to the server
        ssh.connect();

        // Run a simple command on the remote host (e.g. the Hortonworks Sandbox)
        // to verify the connection works
        CustomTask sampleTask1 = new ExecCommand("echo $SSH_CLIENT");
        System.out.println(ssh.exec(sampleTask1));

        // Run the Sqoop import on the remote host: pull the MySQL table into Hive
        CustomTask sampleTask2 = new ExecCommand("sqoop import --connect jdbc:mysql://192.168.56.101:3316/mysql_db_name --username=mysql_user --password=mysql_pwd --table mysql_table_name --hive-import -m 1 -- --schema default");
        ssh.exec(sampleTask2);

        ssh.disconnect();
    }
}
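
If you want to check whether the Sqoop import actually succeeded instead of just firing the command, the exec() call returns a Result object. Below is a minimal sketch, assuming the Result fields (isSuccess, rc, sysout, error_msg) exposed by the sshxcute build linked above; check the version you downloaded before relying on them.

import net.neoremind.sshxcute.core.SSHExec;
import net.neoremind.sshxcute.core.ConnBean;
import net.neoremind.sshxcute.core.Result;
import net.neoremind.sshxcute.task.CustomTask;
import net.neoremind.sshxcute.task.impl.ExecCommand;

public class TestSSHResultCheck {

    public static void main(String[] args) throws Exception {
        ConnBean cb = new ConnBean("192.168.56.102", "root", "hadoop");
        SSHExec ssh = SSHExec.getInstance(cb);
        ssh.connect();

        // Same Sqoop import as above, split across lines for readability
        CustomTask sqoopImport = new ExecCommand(
                "sqoop import --connect jdbc:mysql://192.168.56.101:3316/mysql_db_name"
                + " --username=mysql_user --password=mysql_pwd"
                + " --table mysql_table_name --hive-import -m 1 -- --schema default");

        // exec() returns a Result describing how the remote command finished
        Result res = ssh.exec(sqoopImport);

        if (res.isSuccess) {
            System.out.println("Sqoop import finished, return code: " + res.rc);
            System.out.println("Remote output: " + res.sysout);
        } else {
            System.out.println("Sqoop import failed, return code: " + res.rc);
            System.out.println("Error message: " + res.error_msg);
        }

        ssh.disconnect();
    }
}

This way your Java program can react to a failed import (retry it, or log the error) instead of assuming the remote command worked.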
answered Sep 4 by ravikiran
• 4,560 points
