ImportError: No module named pyspark

0 votes

Hi Guys,

I am trying to import pyspark in my Jupyter notebook, but it shows the error below.

ImportError: No module named 'pyspark'
May 6, 2020 in Apache Spark by akhtar
• 38,260 points
15,476 views

1 answer to this question.

–1 vote

Hi@akhtar,

By default, PySpark is not included in a standard Python installation; you have to install the module yourself. You can install it with the command below:

$ pip install pyspark

After that, the import should work.
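If the import still fails inside Jupyter, a common cause is that the notebook kernel runs a different Python interpreter than the one `pip` installed into. Here is a minimal sketch to check which interpreter the kernel uses and whether `pyspark` is visible to it (only standard-library calls are used; the module name `pyspark` is the only assumption):

```python
import importlib.util
import sys

# Show which interpreter the kernel is using -- pip must install into this one
print("Kernel Python:", sys.executable)

# Check whether pyspark can be found on this interpreter's module search path
spec = importlib.util.find_spec("pyspark")
if spec is None:
    # Install into the kernel's interpreter explicitly, e.g.:
    #   <sys.executable> -m pip install pyspark
    print("pyspark not found -- try:", sys.executable, "-m pip install pyspark")
else:
    print("pyspark found at:", spec.origin)
```

Running `python -m pip install pyspark` with the same interpreter the kernel reports avoids installing the package into a different environment by mistake.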


Thanks.

answered May 6, 2020 by MD
• 95,460 points
Thanks for the info..It worked fine
