Why do we require SSH during Hadoop installation?

0 votes
Jun 11, 2019 in Big Data Hadoop by Lovish
• 130 points

retagged Jun 11, 2019 by Gitika 4,991 views

2 answers to this question.

+2 votes

Hey,

SSH setup is required to perform different operations on a cluster, such as starting and stopping the distributed daemons through the shell scripts.

Hadoop core requires a shell (SSH) to communicate with the slave nodes and to launch processes on them. This communication is frequent when the cluster is live and running in a fully distributed environment.

For a single-node setup of Hadoop, we therefore need to configure SSH access to localhost for the Hadoop user.
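For reference, this is a typical way to set up passwordless SSH to localhost for the Hadoop user (a minimal sketch; the key type and file paths may differ on your system):

# Generate an RSA key pair with an empty passphrase
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
# Authorize the public key for logins to this machine
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# Verify that passwordless login works
ssh localhost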

answered Jun 11, 2019 by Gitika
• 65,910 points

edited Jun 11, 2019 by Gitika
0 votes
When a Hadoop cluster is built, there are master nodes and slave nodes, and the master node controls the tasks running on the slave nodes. Every node is a separate machine, so SSH is used to maintain the connection between these nodes. SSH is mainly used so that the master node can stay connected to the slave nodes and start or stop their daemons.
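To illustrate, this is roughly what the Hadoop start scripts do over SSH (a simplified sketch assuming a Hadoop 3.x layout, not the actual sbin/start-dfs.sh code):

# etc/hadoop/workers lists one slave hostname per line (called "slaves" in Hadoop 2.x)
for worker in $(cat "$HADOOP_HOME/etc/hadoop/workers"); do
  # The master logs in to each worker over SSH and starts the DataNode daemon there
  ssh "$worker" "$HADOOP_HOME/bin/hdfs --daemon start datanode"
done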
answered Jun 11, 2019 by Raman

Related Questions In Big Data Hadoop

0 votes
1 answer

Why do we use the 'help' command in Hadoop Sqoop?

Hi, The command sqoop help lists the tools ...READ MORE

answered Feb 4, 2020 in Big Data Hadoop by MD
• 95,440 points
638 views
0 votes
1 answer

Why do we need MapReduce in Big Data Hadoop?

Hi, As we know Hadoop provides Hdfs as ...READ MORE

answered Feb 4, 2020 in Big Data Hadoop by MD
• 95,440 points
555 views
0 votes
1 answer

What do we exactly mean by “Hadoop” – the definition of Hadoop?

The official definition of Apache Hadoop given ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by Shubham
1,609 views
+1 vote
1 answer

Hadoop Installation Issue on Windows

Below is the main error you are ...READ MORE

answered Mar 26, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
5,684 views
+1 vote
1 answer

Hadoop Mapreduce word count Program

Firstly you need to understand the concept ...READ MORE

answered Mar 16, 2018 in Data Analytics by nitinrawat895
• 11,380 points
10,559 views
+2 votes
11 answers

hadoop fs -put command?

Hi, You can create one directory in HDFS ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by nitinrawat895
• 11,380 points
104,218 views
–1 vote
1 answer

Hadoop dfs -ls command?

In your case there is no difference ...READ MORE

answered Mar 16, 2018 in Big Data Hadoop by kurt_cobain
• 9,390 points
4,261 views
0 votes
1 answer

Why do we need Hadoop framework?

The function of Distributes File System is ...READ MORE

answered Apr 10, 2019 in Big Data Hadoop by Gitika
• 65,910 points

edited Apr 12, 2019 by Gitika 1,132 views
0 votes
3 answers

Can we run Spark without using Hadoop?

No, you can run spark without hadoop. ...READ MORE

answered May 7, 2019 in Big Data Hadoop by pradeep
1,836 views