How to sync Hadoop configuration files to multiple nodes

0 votes
Can you please give me details about how to sync Hadoop configuration files to multiple nodes?
Jan 9, 2018 in Big Data Hadoop by Sudhir Kumar
1,166 views

1 answer to this question.

0 votes

To keep the Hadoop configuration in sync across nodes, first add the IP address and hostname of every machine to the /etc/hosts file on each machine:

192.168.56.101    master
192.168.56.102    slave1
192.168.56.103    slave2
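
A quick sketch of applying and verifying these entries on a node (assuming you have sudo access; the addresses and hostnames are the ones listed above):

# Append the cluster hostnames to /etc/hosts (run on every node)
cat <<'EOF' | sudo tee -a /etc/hosts
192.168.56.101    master
192.168.56.102    slave1
192.168.56.103    slave2
EOF

# Verify that each hostname resolves
ping -c 1 master
ping -c 1 slave1
ping -c 1 slave2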

Next, you need to set up passwordless SSH from the master node to each slave node.
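
A minimal sketch of the SSH setup (assuming the same user exists on all nodes; the user edureka is taken from the paths used below):

# On the master, generate a key pair if one doesn't exist yet
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa

# Copy the public key to each slave (and to the master itself)
ssh-copy-id edureka@master
ssh-copy-id edureka@slave1
ssh-copy-id edureka@slave2

# Test: this should log in without asking for a password
ssh edureka@slave1 hostname
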
core-site.xml on both the master & slave machines (fs.defaultFS is the current name of the deprecated fs.default.name property):

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:9000</value>
  </property>
</configuration>

hdfs-site.xml on the master machine (dfs.permissions.enabled is the current name of the deprecated dfs.permissions property):

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>2</value>
  </property>
  <property>
    <name>dfs.permissions.enabled</name>
    <value>false</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/home/edureka/hadoop-2.7.3/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/home/edureka/hadoop-2.7.3/datanode</value>
  </property>
</configuration>

hdfs-site.xml on the slave machines (no namenode directory is needed here):

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>2</value>
  </property>
  <property>
    <name>dfs.permissions.enabled</name>
    <value>false</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/home/edureka/hadoop-2.7.3/datanode</value>
  </property>
</configuration>
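
The directories referenced in hdfs-site.xml must exist before HDFS is started. A small sketch, assuming the same user and paths as in the configuration above (run the namenode steps on the master only):

# On the master: create both directories and format the namenode once
mkdir -p /home/edureka/hadoop-2.7.3/namenode /home/edureka/hadoop-2.7.3/datanode
/home/edureka/hadoop-2.7.3/bin/hdfs namenode -format

# On each slave: only the datanode directory is needed
mkdir -p /home/edureka/hadoop-2.7.3/datanode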

mapred-site.xml on both the master & slave machines (if the file does not exist, copy it from mapred-site.xml.template in the same directory):

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>

yarn-site.xml on both the master and slave machines (note the corrected property name for the shuffle handler class):

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
</configuration>
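
Once these files are in place on the master, you can sync the whole Hadoop configuration directory to the slaves instead of editing each node by hand. A minimal sketch using rsync, assuming Hadoop is installed under /home/edureka/hadoop-2.7.3 on every node (as the paths above suggest), so the configuration lives in its etc/hadoop subdirectory:

# Push the configuration directory from the master to each slave
for node in slave1 slave2; do
  rsync -avz /home/edureka/hadoop-2.7.3/etc/hadoop/ \
        edureka@$node:/home/edureka/hadoop-2.7.3/etc/hadoop/
done

Keep in mind that hdfs-site.xml differs slightly between the master and the slaves here, so either keep that file out of the sync (for example with --exclude hdfs-site.xml) or maintain it separately on each node.
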
answered Jun 21, 2018 by HackTheCode
