Hi @Ganendra,
I am not sure what the issue is; you probably need to do a bit of troubleshooting. First, check whether the jar file mentioned above exists. If it does not, you can try downloading it manually from the internet. If it does, check the owner and permission mask of the directories involved. Also check the configuration files, since one of the most common mistakes is an incorrectly set HADOOP_CONF_DIR path.
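For example, a quick sketch of those checks from the shell (the paths below are hypothetical; substitute the jar path reported in your error message and your actual install directories):

# Hypothetical jar path; replace with the one from the error message.
ls -l /opt/spark/jars/some-missing.jar
# Check the owner and permission mask of the install directories.
ls -ld /opt/spark /opt/hadoop-2.7.3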
Add HADOOP_CONF_DIR=/opt/hadoop-2.7.3/etc/hadoop/ to ./conf/spark-env.sh so Spark can find the Hadoop configuration, as in the sketch below.
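Since spark-env.sh is sourced as a shell script, the entry would look like this (path taken from above; adjust it to match your Hadoop installation):

# ./conf/spark-env.sh
export HADOOP_CONF_DIR=/opt/hadoop-2.7.3/etc/hadoop/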
Check that your core-site.xml file contains an entry like the one below.
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://masterip:port</value>
  </property>
</configuration>
Also, remove the setMaster('local') call from your source file; hard-coding 'local' there forces the job to run locally and overrides whatever master you pass on the command line, so it never reaches the cluster.
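Here is a minimal PySpark sketch of what that looks like (the app name and submit command are illustrative; this assumes you launch the job with spark-submit so the master is supplied on the command line):

from pyspark import SparkConf, SparkContext

# No .setMaster('local') here; let spark-submit decide the master.
conf = SparkConf().setAppName('MyApp')
sc = SparkContext(conf=conf)

Then submit it with something like: spark-submit --master yarn your_app.py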