Hadoop Map Reduce: java.lang.reflect.InvocationTargetException


I am getting a java.lang.reflect.InvocationTargetException when I run my MRUnit test. The code is as follows:

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Before;
import org.junit.Test;

public class MyTest {

    MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;

    @Before
    public void setup() {
        MyMapper mr = new MyMapper();
        mapDriver = MapDriver.newMapDriver(mr);
        System.out.println("mapdriver: " + mapDriver);
    }

    @Test
    public void resultSuccess() throws IOException {
        mapDriver.withInput(new LongWritable(), new Text("655209;1;796764372490213;804422938115889;6"));
        mapDriver.withOutput(new Text("6"), new IntWritable(1));
        mapDriver.runTest();
    }
}



Mapper class
============


package com.mr;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MyMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    Text word = new Text();
    IntWritable one = new IntWritable(1);

    public void map(LongWritable key, Text value, Context con)
            throws IOException, InterruptedException {
        String strValu = value.toString();
        String[] words = strValu.split(":");
        if (Integer.parseInt(words[1]) == 1) {
            word.set(words[4]);
            con.write(word, one);
        }
    }
}

Dec 17, 2018 in Big Data Hadoop by slayer

1 answer to this question.

I executed the same code and it worked; I got no error. Create a new project, add the code there, and run it from the new project. Recreating the project often clears a stale build configuration or a broken classpath, so hopefully it will run.
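If recreating the project does not help, one detail in the posted code is worth double-checking: the test feeds a semicolon-delimited record, but the mapper splits on ":". Since the input line contains no ":", split returns a single token, so words[1] throws an ArrayIndexOutOfBoundsException, which the test harness surfaces wrapped in an InvocationTargetException. A minimal plain-Java sketch of the mismatch (no Hadoop needed):

```java
public class DelimiterCheck {
    public static void main(String[] args) {
        // The exact record the MRUnit test feeds to the mapper.
        String record = "655209;1;796764372490213;804422938115889;6";

        // Splitting on ":" (as the posted mapper does) finds no delimiter,
        // so the whole line comes back as one token; accessing words[1]
        // would throw ArrayIndexOutOfBoundsException.
        String[] wrong = record.split(":");
        System.out.println(wrong.length);   // 1

        // Splitting on ";" matches the test input and yields all five fields.
        String[] right = record.split(";");
        System.out.println(right.length);   // 5
        System.out.println(right[4]);       // 6 -- the key the test expects
    }
}
```

If this is indeed the cause, changing strValu.split(":") to strValu.split(";") in MyMapper should make the posted test pass as written.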
answered Dec 17, 2018 by Omkar
