ImportError: No module named 'pyspark'


Hi Guys,

I am trying to import pyspark in my Jupyter notebook, but it throws the error below.

ImportError: No module named 'pyspark'
May 6 in Apache Spark by akhtar
• 25,030 points

1 answer to this question.



By default, pyspark is not part of the standard Python distribution, so you have to install the module yourself. Make sure you run pip with the same Python interpreter that your notebook kernel uses (for example, `python -m pip install pyspark`). To install it, run the command below.

$ pip install pyspark

After installing, restart your Jupyter kernel and the import should work.
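If you are not sure whether the install succeeded, or suspect your notebook kernel is using a different interpreter than the one pip installed into, you can check from inside the notebook itself. This is a minimal diagnostic sketch using only the standard library (no pyspark required to run it):

```python
import importlib.util
import sys

# Show which Python interpreter this kernel is running --
# pip must install pyspark into this same interpreter.
print("Kernel interpreter:", sys.executable)

# find_spec returns None if the module cannot be imported,
# so this checks for pyspark without raising ImportError.
if importlib.util.find_spec("pyspark") is None:
    print("pyspark is NOT installed for this interpreter")
else:
    print("pyspark is available")
```

If the interpreter shown here differs from the one where you ran `pip install pyspark`, that mismatch is the usual cause of this error.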

answered May 6 by MD
• 56,480 points
