PySpark not starting: No active SparkContext


Hi,
I am not able to start a PySpark session in the Edureka VM.

I am running the command "pyspark", and the error message is:

"Connection Refused: localhost/127.0.0.1:7077"
No active SparkContext

I don't get the Spark logo at the end. Can you help me resolve this?

Jul 30 in Apache Spark by Karan

1 answer to this question.


It looks like the Spark standalone daemons are not running: port 7077 is the default port of the standalone master, and the connection to it is being refused. Start the daemons first and then launch pyspark. Refer to the commands below:

$ cd /usr/lib/spark-2.1.1-bin-hadoop2.7/sbin

$ ./start-all.sh    # starts the standalone master and worker daemons
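
If start-all.sh completes but the shell still can't connect, it's worth verifying that the daemons actually came up before retrying. A minimal check, assuming a JDK's jps is on the PATH and the master is listening on the default port 7077:

$ jps    # should list Master and Worker among the running JVMs

$ pyspark --master spark://localhost:7077

Once the shell comes up with the Spark banner, a quick smoke test confirms the SparkContext is active:

>>> sc.parallelize(range(10)).count()    # returns 10 if sc is working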
answered Jul 30 by Jishan
