1. Check whether the master and worker nodes are active.
2. If they are not, as in the screenshot above, start them manually.
Run the following commands in the terminal:
cd /usr/lib/spark2.1.1-bin-hadoop2.7/sbin
./start-all.sh
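Before running start-all.sh, the check in step 1 can also be done from the terminal rather than a screenshot. A minimal sketch, assuming a JDK is installed so that jps (the JVM process lister) is available; the standalone master and worker show up as "Master" and "Worker" in its output:

```shell
#!/bin/sh
# Sketch: confirm whether the Spark standalone daemons are up.
# Assumes jps is on the PATH (it ships with the JDK).
if jps 2>/dev/null | grep -qE 'Master|Worker'; then
  echo "Spark daemons running"
else
  echo "Spark daemons not running"
fi
```

If this prints "Spark daemons not running", proceed with the start-all.sh step above.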
3. Now launch the shell with the spark-shell command. Refer to the screenshot below: