Job for ssh.service failed because the control process exited with error code. See “systemctl status ssh.service” and “journalctl -xe” for details


Output of sudo dpkg --configure openssh-server: https://pastebin.com/GGa0uHAs

Output of journalctl -xe: https://pastebin.com/Cgewh88r

How can I solve this?

Tags: server, ssh

Feb 6 in Linux Administration by Sanjana

1 answer to this question.


Hi @Sanjana,

I'm not sure whether this will work, but you can check which port sshd is trying to listen on. If that port shows as busy (already in use by another process), you can change it and try again.

To see which ports are currently in use, run the command below:
$ netstat -tnlp
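If the output shows that port 22 is already held by another process, one possible workaround is to move sshd to a free port. This is only a sketch: it assumes a Debian/Ubuntu layout with the config at /etc/ssh/sshd_config, and port 2222 below is just an example, not a recommendation.

```shell
# Show which process, if any, is listening on port 22
# (ss is the modern replacement for netstat)
sudo ss -tnlp | grep ':22 '

# Change the Port directive in sshd_config, e.g. from
# "Port 22" (or the commented "#Port 22") to a free port:
sudo sed -i 's/^#\?Port 22$/Port 2222/' /etc/ssh/sshd_config

# Validate the config before restarting; sshd -t prints
# nothing and exits 0 when the file is syntactically valid
sudo sshd -t

# Restart the service and check its status
sudo systemctl restart ssh.service
systemctl status ssh.service
```

Remember to connect with the new port afterwards, e.g. ssh -p 2222 user@host, and to open that port in any firewall rules.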

Hope this helps.
Thank you.

answered 4 days ago by MD
