Java SQL SQLException: Error while processing statement


Hi everyone, I am facing a weird issue. My job fails intermittently with the error below. For example, over a 7-day window the job succeeds on about 5 days and fails on the other 1 or 2 days with this error. I don't understand why this is happening. Can anyone share their expertise? Thanks in advance.

19/08/06 14:51:36 INFO utilities.CreateConnection: Creating the connection
java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
    at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:275)
    at org.apache.hive.jdbc.HiveStatement.executeUpdate(HiveStatement.java:369)
    at com.cisco.mood.processing.utilities.ProcessingDBUtilities.executeQuery(ProcessingDBUtilities.java:56)
    at com.cisco.mood.processing.preperation.pre.SRPrePreperation.prepare(SRPrePreperation.java:24)
    at com.cisco.mood.processing.process.ProcessData.process(ProcessData.java:58)
    at com.cisco.mood.processing.process.MOODProcessingJob.main(MOODProcessingJob.java:44)
19/08/06 14:52:25 INFO process.ProcessData: Exception in MOOD job
19/08/06 14:52:25 INFO utilities.JobMailer: Sending Mail with :MOOD Job failed while processing and export of data(PROD)
19/08/06 14:52:25 INFO utilities.JobMailer: body:MOOD Job failed while processing and export of data.Please look into the issue.
java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
    at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:275)
    at org.apache.hive.jdbc.HiveStatement.executeUpdate(HiveStatement.java:369)
    at com.cisco.mood.processing.utilities.ProcessingDBUtilities.executeQuery(ProcessingDBUtilities.java:56)
    at com.cisco.mood.processing.preperation.pre.SRPrePreperation.prepare(SRPrePreperation.java:24)
    at com.cisco.mood.processing.process.ProcessData.process(ProcessData.java:58)
    at com.cisco.mood.processing.process.MOODProcessingJob.main(MOODProcessingJob.java:44)
19/08/06 14:52:26 INFO process.MOODProcessingJob: MOOD Job Failed
Aug 7, 2019 in Big Data Hadoop by Hemanth
edited Aug 7, 2019 by Omkar
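One note on the pattern here: the identical job succeeds on most days and fails with the same return-code-2 error on others, which suggests a transient failure (cluster load, a flaky node, a resource contention window). While root-causing, a bounded retry around the statement execution can keep the schedule green. This is only a sketch — `RetryingExecutor` is a hypothetical helper, not part of the actual MOOD job code:

```java
import java.sql.SQLException;
import java.util.concurrent.Callable;

public class RetryingExecutor {

    /**
     * Runs the given action, retrying up to maxAttempts times on SQLException,
     * sleeping delayMillis between attempts. Rethrows the last SQLException
     * once attempts are exhausted. Hypothetical helper for illustration only.
     */
    public static <T> T withRetry(Callable<T> action, int maxAttempts, long delayMillis)
            throws Exception {
        if (maxAttempts < 1) {
            throw new IllegalArgumentException("maxAttempts must be >= 1");
        }
        SQLException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return action.call();
            } catch (SQLException e) {
                last = e;                          // possibly transient Hive failure
                if (attempt < maxAttempts) {
                    Thread.sleep(delayMillis);     // back off before the next attempt
                }
            }
        }
        throw last;                                // all attempts exhausted
    }
}
```

Usage in a job like this would be along the lines of `withRetry(() -> stmt.executeUpdate(hql), 3, 60_000L)` — three attempts, one minute apart.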
Hi, @Hemanth

Can you tell what command you are using?
I am not running any command manually. Sometimes the scheduled job fails with this error, but if I manually re-run the .sh file the job succeeds. When I went through the other logs I found this error:

INFO hive.HiveImport: Loading uploaded data into Hive
19/08/05 11:31:31 INFO conf.HiveConf: Found configuration file file:/opt/mapr/sqoop/sqoop-1.4.6/conf/hive-site.xml
2019-08-05 11:31:39,939 main ERROR Cannot access RandomAccessFile {}) java.io.FileNotFoundException: /opt/mapr/hive/hive-2.1/logs/phodisvc/hive.log (Permission denied)
2019-08-05 11:31:39,951 main ERROR Unable to invoke factory method in class class org.apache.logging.log4j.core.appender.RollingRandomAccessFileAppender for element RollingRandomAccessFile. java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:136)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:813)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:753)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:745)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:389)
    at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:169)
    at org.apache.logging.log4j.core.config.builder.impl.DefaultConfigurationBuilder.build(DefaultConfigurationBuilder.java:158)
    at org.apache.logging.log4j.core.config.builder.impl.DefaultConfigurationBuilder.build(DefaultConfigurationBuilder.java:43)
    at org.apache.logging.log4j.core.config.properties.PropertiesConfigurationFactory.getConfiguration(PropertiesConfigurationFactory.java:149)
    at org.apache.logging.log4j.core.config.properties.PropertiesConfigurationFactory.getConfiguration(PropertiesConfigurationFactory.java:46)
    at org.apache.logging.log4j.core.config.ConfigurationFactory.getConfiguration(ConfigurationFactory.java:236)
    at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:445)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:228)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:140)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:113)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:98)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:156)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jDefault(LogUtils.java:155)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:91)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:83)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:66)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:667)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:652)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:647)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:515)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:606)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.lang.IllegalStateException: ManagerFactory [org.apache.logging.log4j.core.appender.rolling.RollingRandomAccessFileManager$RollingRandomAccessFileManagerFactory@2a9e754e] unable to create manager for [/opt/mapr/hive/hive-2.1/logs/phodisvc/hive.log] with data [org.apache.logging.log4j.core.appender.rolling.RollingRandomAccessFileManager$FactoryData@760a2b6e]
    at org.apache.logging.log4j.core.appender.AbstractManager.getManager(AbstractManager.java:73)
    at org.apache.logging.log4j.core.appender.OutputStreamManager.getManager(OutputStreamManager.java:61)
    at org.apache.logging.log4j.core.appender.rolling.RollingRandomAccessFileManager.getRollingRandomAccessFileManager(RollingRandomAccessFileManager.java:84)
    at org.apache.logging.log4j.core.appender.RollingRandomAccessFileAppender.createAppender(RollingRandomAccessFileAppender.java:206)
    ... 42 more
2019-08-05 11:31:39,954 main ERROR Null object returned for RollingRandomAccessFile in Appenders.
2019-08-05 11:31:39,954 main ERROR Unable to locate appender "DRFA" for logger config "root"
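One thing worth checking in the log above: the Hive CLI that Sqoop launches cannot write its log file, /opt/mapr/hive/hive-2.1/logs/phodisvc/hive.log (Permission denied). Write access to that directory can differ between the scheduled run and a manual run if they execute as different users, which would fit the "fails scheduled, succeeds manually" pattern. A minimal check you could drop into the job is sketched below — `LogDirCheck` is a hypothetical helper, and the path is taken from your log:

```java
import java.io.File;

public class LogDirCheck {

    /** Returns true if the given directory exists and is writable by this process. */
    public static boolean canWriteLogs(String dir) {
        File d = new File(dir);
        return d.isDirectory() && d.canWrite();
    }

    public static void main(String[] args) {
        // Path taken from the "Permission denied" error in the log above.
        String logDir = "/opt/mapr/hive/hive-2.1/logs/phodisvc";
        System.out.println(logDir + " writable by "
                + System.getProperty("user.name") + ": " + canWriteLogs(logDir));
    }
}
```

Running this under the same user as the scheduled job (versus your own account) would show quickly whether the two runs have different permissions on that directory.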

Hi, @Hemanth

Have you added the below properties to your Hive configuration?

SET hive.exec.dynamic.partition = true;

SET hive.exec.dynamic.partition.mode = nonstrict;

SET hive.auto.convert.join = false;

SET mapreduce.map.memory.mb=8192;

SET mapreduce.reduce.memory.mb=8192;

I am not sure. Can you please let me know where to add these?

Hi, @Hemanth

You need to set these properties via hiveconf, i.e., in your Hive configuration.
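If the job issues its HQL over JDBC (the org.apache.hive.jdbc.HiveStatement frames in the original trace suggest it does), the same properties can also be applied per session by executing the SET commands on the connection before the main statement, with no hive-site.xml change needed. A minimal sketch — the URL, credentials, and HQL are placeholders, and the Hive JDBC driver is assumed to be on the classpath:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class HiveSessionSettings {

    // The SET commands suggested above, issued once per session.
    static final String[] SESSION_SETTINGS = {
        "SET hive.exec.dynamic.partition = true",
        "SET hive.exec.dynamic.partition.mode = nonstrict",
        "SET hive.auto.convert.join = false",
        "SET mapreduce.map.memory.mb=8192",
        "SET mapreduce.reduce.memory.mb=8192"
    };

    public static void main(String[] args) throws Exception {
        // Placeholder HiveServer2 URL and credentials; replace with your cluster's.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://host:10000/default", "user", "");
             Statement stmt = conn.createStatement()) {
            for (String setting : SESSION_SETTINGS) {
                stmt.execute(setting);   // applies only to this Hive session
            }
            // Placeholder for the job's actual HQL (e.g. the failing INSERT).
            String hql = "...";
            stmt.executeUpdate(hql);
        }
    }
}
```

Whether these particular settings fix the return-code-2 failure depends on what the failing MapRedTask is actually doing; raising the map/reduce memory helps when the underlying task is dying on memory, which the YARN/MapReduce task logs would confirm.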
