Hey, I have installed Hadoop on a single node and run a job on one file in HDFS, but now I want to run it on multiple files. I know how to upload the files, but I don't know how to run the job on all of them.

0 votes
Apr 19 in Big Data Hadoop by حسين
• 200 points

edited Apr 29 by Gitika
50 views

1 answer to this question.

0 votes

If you are talking about running multiple jobs, you can do that by submitting each job through the JobClient.runJob() API.

public static RunningJob runJob(JobConf job) throws IOException

There are different ways of using this; you can refer to this link to learn more about it.
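For example, here is a minimal sketch using the old org.apache.hadoop.mapred API that the signature above belongs to. It runs one job over several HDFS files; the input/output paths are placeholders, and IdentityMapper/IdentityReducer are only stand-ins for your own mapper and reducer classes:

import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RunningJob;
import org.apache.hadoop.mapred.lib.IdentityMapper;
import org.apache.hadoop.mapred.lib.IdentityReducer;

public class MultiFileDriver {
    public static void main(String[] args) throws IOException {
        JobConf conf = new JobConf(MultiFileDriver.class);
        conf.setJobName("multi-file-job");

        // Stand-in mapper/reducer; replace these with your own classes.
        conf.setMapperClass(IdentityMapper.class);
        conf.setReducerClass(IdentityReducer.class);

        // With the default TextInputFormat and identity map/reduce,
        // the output keys are byte offsets and the values are the lines.
        conf.setOutputKeyClass(LongWritable.class);
        conf.setOutputValueClass(Text.class);

        // Add each HDFS file you want processed (or point at a whole directory).
        // These paths are examples only.
        FileInputFormat.addInputPath(conf, new Path("/user/hadoop/input/file1.txt"));
        FileInputFormat.addInputPath(conf, new Path("/user/hadoop/input/file2.txt"));
        FileOutputFormat.setOutputPath(conf, new Path("/user/hadoop/output"));

        // Submit the job and wait for it to finish.
        RunningJob job = JobClient.runJob(conf);
        System.out.println("Job successful: " + job.isSuccessful());
    }
}

Because JobClient.runJob() blocks until the submitted job completes, you can also call it several times in a row (once per JobConf) to run multiple jobs one after another.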

answered Apr 22 by Tina
