ImportError: No module named 'pyspark'

0 votes

Hi Guys,

I am trying to import pyspark in my Jupyter notebook, but it shows me the error below.

ImportError: No module named 'pyspark'
May 6 in Apache Spark by akhtar
• 11,270 points
85 views

1 answer to this question.

0 votes

Hi@akhtar,

By default, pyspark is not included in a standard Python installation, so you have to install the module yourself. To install it, run the command below.

$ pip install pyspark

After that it will work.
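If the error persists after installing, the notebook kernel may be running a different interpreter than the one `pip` installed into. A quick way to check from inside the notebook whether the current kernel can see the module is `importlib.util.find_spec` from the standard library (a minimal sketch; the helper name `module_available` is just illustrative):

```python
import importlib.util

def module_available(name):
    """Return True if the named top-level module can be imported
    by the interpreter this code is running in."""
    return importlib.util.find_spec(name) is not None

# After `pip install pyspark` succeeds in the right environment,
# this should print True inside the notebook.
print(module_available("pyspark"))
```

If it prints `False` even after installing, run `import sys; print(sys.executable)` in the notebook and install with that exact interpreter (`/path/to/python -m pip install pyspark`) so the package lands in the environment the kernel actually uses.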

answered May 6 by MD
• 24,500 points
