ImportError No module named pyspark

0 votes

Hi Guys,

I am trying to import pyspark in my Jupyter notebook, but it throws the error below.

ImportError: No module named 'pyspark'
May 6, 2020 in Apache Spark by akhtar
• 38,220 points

1 answer to this question.

–1 vote


By default, pyspark is not included in a standard Python installation; you have to install the module yourself. To install it, run the command below.

$ pip install pyspark

After that, the import should work. Make sure you run pip with the same Python environment that your Jupyter kernel uses, otherwise the notebook still won't see the package.
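A common gotcha is that Jupyter runs a different interpreter than the shell where pip was invoked. As a quick sketch (the helper name `module_available` is ours, not part of any library), you can check from inside the notebook whether the kernel's interpreter can actually see pyspark:

```python
import importlib.util

def module_available(name: str) -> bool:
    """Return True if `name` can be imported by this interpreter."""
    return importlib.util.find_spec(name) is not None

# Run this in a notebook cell; if it prints False, pyspark was
# installed into a different environment than the kernel uses.
print("pyspark available:", module_available("pyspark"))
```

If it prints `False`, reinstall with `!{sys.executable} -m pip install pyspark` from a notebook cell so pip targets the kernel's own interpreter.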



answered May 6, 2020 by MD
• 95,320 points
Thanks for the info. It worked fine.
