PySpark not starting: No active SparkContext

Hi,
I am not able to start a PySpark session in the Edureka VM.

I am using the command "pyspark". The error message is:

"Connection Refused: localhost/127.0.0.1:7077"
No active sparkcontext

I also don't get the Spark logo banner at the end. Can you help me resolve it?

Jul 30, 2019 in Apache Spark by Karan

It seems the Spark standalone daemons (master and worker) are not running, which is why nothing is listening on port 7077. Start them first and then launch pyspark. Refer to the commands below:

$ cd /usr/lib/spark-2.1.1-bin-hadoop2.7
$ cd sbin
$ ./start-all.sh
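
Once the daemons are up, running jps should list a Master and a Worker process. You can then confirm that PySpark can actually reach the master by creating a SparkContext explicitly. A minimal sketch follows; the master URL spark://localhost:7077 is an assumption based on the port in your error message and may differ in your VM:

from pyspark import SparkConf, SparkContext

# spark://localhost:7077 is assumed from the port in the error message;
# adjust the host/port if your master binds elsewhere.
conf = SparkConf().setAppName("smoke-test").setMaster("spark://localhost:7077")
sc = SparkContext(conf=conf)

print(sc.version)                        # prints the Spark version if the context is active
print(sc.parallelize(range(10)).sum())   # runs a tiny job to confirm the cluster works
sc.stop()

If the SparkContext is created without a "Connection Refused" error, plain pyspark should also start normally.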
answered Jul 30, 2019 by Jishan
