ImportError No module named pyspark


Hi Guys,

I am trying to import pyspark in my Jupyter notebook, but it fails with the error below.

ImportError: No module named 'pyspark'
May 6, 2020 in Apache Spark by akhtar

1 answer to this question.


Hi@akhtar,

By default, pyspark is not part of the standard Python installation, so you have to install the module yourself. To install it, run the command below:

$ pip install pyspark

After that, the import should work.
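If the import still fails after installing, a common cause is that pip installed the package into a different Python interpreter than the one the notebook kernel is running. A quick stdlib-only check (a sketch, not part of the original answer) you can run in a notebook cell to see which interpreter is active and whether pyspark is visible to it:

```python
import importlib.util
import sys

# Which interpreter is this notebook actually running? pip must
# install into this same interpreter, or the import will still fail.
print("Python executable:", sys.executable)

# Check whether pyspark is importable without actually importing it.
spec = importlib.util.find_spec("pyspark")
if spec is None:
    print("pyspark is missing - install it with: pip install pyspark")
else:
    print("pyspark found at:", spec.origin)
```

If the executable shown is not the one you installed into, running `python -m pip install pyspark` with that same interpreter avoids the mismatch.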

answered May 6, 2020 by MD
Thanks for the info. It worked fine.
