env python No such file or directory in pyspark

0 votes

I am getting the below error when trying to run the pyspark shell on my Linux system:

env: python: No such file or directory

How can I solve this error?

Thank You

Apr 7, 2020 in Apache Spark by akhtar
• 38,170 points
1,916 views

1 answer to this question.

0 votes

Hi@akhtar,

This error occurs because the pyspark launcher starts Python with "env python", but many recent Linux distributions ship only a python3 executable, so env cannot find a command named python on the PATH. You can confirm this and then fix it with the commands below.
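First, check what is actually on the PATH (a quick sanity check; the paths below assume a standard install under /usr/bin):

$ which python     # prints nothing if python is missing, which is what triggers this error
$ which python3    # should print something like /usr/bin/python3

If python3 is present but python is not, create the symlink: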

$ sudo ln -s /usr/bin/python3 /usr/bin/python
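If you would rather not change the system-wide python command, you can instead point Spark at your interpreter through the PYSPARK_PYTHON environment variable (a minimal sketch, assuming Python 3 lives at /usr/bin/python3):

$ export PYSPARK_PYTHON=/usr/bin/python3
$ export PYSPARK_DRIVER_PYTHON=/usr/bin/python3
$ pyspark

To make this permanent, add the two export lines to your ~/.bashrc or to $SPARK_HOME/conf/spark-env.sh.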

Hope this will help.

Thank You

answered Apr 7, 2020 by MD
• 95,060 points

Related Questions In Apache Spark

0 votes
1 answer

I want to see my public key after running the cat <path> command in Git Bash, but it says no such file or directory.

Hey, @KK, You can fix this issue may be ...READ MORE

answered May 26, 2020 in Apache Spark by Gitika
• 65,870 points
151 views
0 votes
1 answer

Facing issue while reading tsv file in pyspark

Hi@khyati, You are getting this type of output ...READ MORE

answered Sep 28, 2020 in Apache Spark by MD
• 95,060 points
340 views
0 votes
1 answer

Which query to use for better performance, join in SQL or using Dataset API?

DataFrames and SparkSQL performed almost about the ...READ MORE

answered Apr 19, 2018 in Apache Spark by kurt_cobain
• 9,390 points
589 views
0 votes
1 answer

Efficient way to read specific columns from parquet file in spark

As parquet is a column based storage ...READ MORE

answered Apr 20, 2018 in Apache Spark by kurt_cobain
• 9,390 points
3,639 views
0 votes
1 answer

Is it possible to run Apache Spark without Hadoop?

Though Spark and Hadoop were the frameworks designed ...READ MORE

answered May 2, 2019 in Big Data Hadoop by ravikiran
• 4,620 points
319 views
0 votes
1 answer

Where can I get best spark tutorials for beginners?

Hi@akhtar There are lots of online courses available ...READ MORE

answered May 14, 2020 in Apache Spark by MD
• 95,060 points
197 views
0 votes
1 answer

env : R : No such file or directory

Hi@akhtar, I also got this error. I am able to ...READ MORE

answered Jul 21, 2020 in Apache Spark by MD
• 95,060 points
409 views