I am getting an error while starting the Hadoop daemons (i.e. ResourceManager and NodeManager) in Hadoop 3


I have installed Hadoop 3 (alpha version) in pseudo-distributed mode, following the official Hadoop 3 documentation. Afterwards, when I try to execute the MapReduce examples, the connection gets refused.
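A quick way to see which daemons actually came up is jps, which lists the running Java processes; in a healthy pseudo-distributed setup you should see NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager:

    # List the running Java daemons; if ResourceManager is missing,
    # anything submitting a MapReduce job gets "connection refused"
    jps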

Later, when I started all the Hadoop daemons using the sbin/start-all.sh script, I noticed some exceptions:

INFO org.apache.commons.beanutils.FluentPropertyBeanIntrospector: Error when creating PropertyDescriptor for public final void org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)! Ignoring this property.
DEBUG org.apache.commons.beanutils.FluentPropertyBeanIntrospector: Exception is:
java.beans.IntrospectionException: bad write method arg count: public final void org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)
    at java.desktop/java.beans.PropertyDescriptor.findPropertyType(PropertyDescriptor.java:696)
    at java.desktop/java.beans.PropertyDescriptor.setWriteMethod(PropertyDescriptor.java:356)
    at java.desktop/java.beans.PropertyDescriptor.<init>(PropertyDescriptor.java:142)
    at org.apache.commons.beanutils.FluentPropertyBeanIntrospector.createFluentPropertyDescritor(FluentPropertyBeanIntrospector.java:178)
    at org.apache.commons.beanutils.FluentPropertyBeanIntrospector.introspect(FluentPropertyBeanIntrospector.java:141)
    at org.apache.commons.beanutils.PropertyUtilsBean.fetchIntrospectionData(PropertyUtilsBean.java:2245)
    at org.apache.commons.beanutils.PropertyUtilsBean.getIntrospectionData(PropertyUtilsBean.java:2226)
    at org.apache.commons.beanutils.PropertyUtilsBean.getPropertyDescriptor(PropertyUtilsBean.java:954)
    at org.apache.commons.beanutils.PropertyUtilsBean.isWriteable(PropertyUtilsBean.java:1478)
    at org.apache.commons.configuration2.beanutils.BeanHelper.isPropertyWriteable(BeanHelper.java:521)
    at org.apache.commons.configuration2.beanutils.BeanHelper.initProperty(BeanHelper.java:357)
    at org.apache.commons.configuration2.beanutils.BeanHelper.initBeanProperties(BeanHelper.java:273)
    at org.apache.commons.configuration2.beanutils.BeanHelper.initBean(BeanHelper.java:192)
    at org.apache.commons.configuration2.beanutils.BeanHelper$BeanCreationContextImpl.initBean(BeanHelper.java:669)
    at org.apache.commons.configuration2.beanutils.DefaultBeanFactory.initBeanInstance(DefaultBeanFactory.java:162)
    at org.apache.commons.configuration2.beanutils.DefaultBeanFactory.createBean(DefaultBeanFactory.java:116)
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:459)
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:479)
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:492)
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.createResultInstance(BasicConfigurationBuilder.java:447)
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.createResult(BasicConfigurationBuilder.java:417)
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.getConfiguration(BasicConfigurationBuilder.java:285)
    at org.apache.hadoop.metrics2.impl.MetricsConfig.loadFirst(MetricsConfig.java:119)
    at org.apache.hadoop.metrics2.impl.MetricsConfig.create(MetricsConfig.java:98)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.configure(MetricsSystemImpl.java:478)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.start(MetricsSystemImpl.java:188)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.init(MetricsSystemImpl.java:163)
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.init(DefaultMetricsSystem.java:62)
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.initialize(DefaultMetricsSystem.java:58)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceInit(ResourceManager.java:678)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.createAndInitActiveServices(ResourceManager.java:1129)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceInit(ResourceManager.java:315)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1407)

FATAL org.apache.hadoop.yarn.server.resourcemanager.ResourceManager: Error starting ResourceManager
java.lang.ExceptionInInitializerError
    at com.google.inject.internal.cglib.reflect.$FastClassEmitter.<init>(FastClassEmitter.java:67)
    at com.google.inject.internal.cglib.reflect.$FastClass$Generator.generateClass(FastClass.java:72)
    at com.google.inject.internal.cglib.core.$DefaultGeneratorStrategy.generate(DefaultGeneratorStrategy.java:25)
    at com.google.inject.internal.cglib.core.$AbstractClassGenerator.create(AbstractClassGenerator.java:216)
    at com.google.inject.internal.cglib.reflect.$FastClass$Generator.create(FastClass.java:64)
    at com.google.inject.internal.BytecodeGen.newFastClass(BytecodeGen.java:204)
    at com.google.inject.internal.ProviderMethod$FastClassProviderMethod.<init>(ProviderMethod.java:256)
    at com.google.inject.internal.ProviderMethod.create(ProviderMethod.java:71)
    at com.google.inject.internal.ProviderMethodsModule.createProviderMethod(ProviderMethodsModule.java:275)
    at com.google.inject.internal.ProviderMethodsModule.getProviderMethods(ProviderMethodsModule.java:144)
    at com.google.inject.internal.ProviderMethodsModule.configure(ProviderMethodsModule.java:123)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:349)
    at com.google.inject.AbstractModule.install(AbstractModule.java:122)
    at com.google.inject.servlet.ServletModule.configure(ServletModule.java:52)
    at com.google.inject.AbstractModule.configure(AbstractModule.java:62)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340)
    at com.google.inject.spi.Elements.getElements(Elements.java:110)
    at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:138)
    at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:104)
    at com.google.inject.Guice.createInjector(Guice.java:96)
    at com.google.inject.Guice.createInjector(Guice.java:73)
    at com.google.inject.Guice.createInjector(Guice.java:62)
    at org.apache.hadoop.yarn.webapp.WebApps$Builder.build(WebApps.java:332)
    at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:377)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:1116)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1218)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1408)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make protected final java.lang.Class java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain) throws java.lang.ClassFormatError accessible: module java.base does not "opens java.lang" to unnamed module @173f73e7
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:337)
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:281)
    at java.base/java.lang.reflect.Method.checkCanSetAccessible(Method.java:197)
    at java.base/java.lang.reflect.Method.setAccessible(Method.java:191)
    at com.google.inject.internal.cglib.core.$ReflectUtils$2.run(ReflectUtils.java:56)
    at java.base/java.security.AccessController.doPrivileged(Native Method)
    at com.google.inject.internal.cglib.core.$ReflectUtils.<clinit>(ReflectUtils.java:46)
    ... 29 more

My configurations are:

core-site.xml:

<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>

hdfs-site.xml:

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>

mapred-site.xml:

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>

and yarn-site.xml:

<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.nodemanager.env-whitelist</name>
      <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
    </property>
</configuration>
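The exceptions above come from the ResourceManager side; the full trace is recorded in its daemon log. To inspect it yourself (the glob is an assumption based on the default logs directory inside the Hadoop installation; the exact file name includes your user and hostname):

    # Show the tail of the ResourceManager log, where the FATAL
    # startup error is recorded (adjust if you changed HADOOP_LOG_DIR)
    tail -n 100 $HADOOP_HOME/logs/*resourcemanager*.log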

Can someone help me out with the problem?

Apr 15, 2018 in Big Data Hadoop by coldcode

1 answer to this question.


The error is because you are running Hadoop 3 on Java 9, which Hadoop does not support yet. The giveaway is the InaccessibleObjectException in your trace: the Java 9 module system blocks the reflective access to java.lang.ClassLoader.defineClass that Guice's bundled cglib needs, so the ResourceManager's web app (and hence the ResourceManager itself) fails to start.

Change your Java version to 8 and the issue should be resolved.
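If JDK 8 is installed alongside the newer JDK, the fix is a matter of pointing JAVA_HOME at it. A minimal sketch (the JDK path below is an assumption based on the Debian/Ubuntu OpenJDK 8 package; substitute wherever Java 8 lives on your system):

    # Check which JVM is currently picked up
    java -version

    # Use a JDK 8 installation for this session
    # (example path; adjust to your actual JDK 8 location)
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

    # Persist the same JAVA_HOME for the daemons in
    # etc/hadoop/hadoop-env.sh, then restart everything so the
    # daemons pick up the new JVM
    sbin/stop-all.sh
    sbin/start-all.sh

In principle the reflective-access failure alone can be silenced on Java 9 by adding --add-opens java.base/java.lang=ALL-UNNAMED to HADOOP_OPTS, but the Hadoop 3 alpha has other Java 9 incompatibilities, so downgrading to Java 8 is the reliable route.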

answered Apr 15, 2018 by Shubham
