py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM

I tried to integrate Python with PySpark, but I am getting the error below.

20/04/06 10:46:17 WARN Utils: Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 10.0.2.15 instead (on interface enp0s3)
20/04/06 10:46:17 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/04/06 10:46:18 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Traceback (most recent call last):
  File "program.py", line 5, in <module>
    sc = SparkContext.getOrCreate(SparkConf())
  File "/usr/local/lib/python3.6/site-packages/pyspark/context.py", line 367, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/usr/local/lib/python3.6/site-packages/pyspark/context.py", line 136, in __init__
    conf, jsc, profiler_cls)
  File "/usr/local/lib/python3.6/site-packages/pyspark/context.py", line 213, in _do_init
    self._encryption_enabled = self._jvm.PythonUtils.getEncryptionEnabled(self._jsc)
  File "/root/Desktop/pyspark/spark-3.0.0-preview2-bin-hadoop2.7-1/spark-3.0.0-preview2-bin-hadoop2.7/python/lib/py4j-0.10.8.1-src.zip/py4j/java_gateway.py", line 1516, in __getattr__
py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM


How can I solve this error?
Thank You

Apr 7 in Apache Spark by akhtar

1 answer to this question.

Hi @akhtar,

This error can occur if the required environment variables are not set in your .bashrc file. Set your Python environment variables as follows.

export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.8.1-src.zip:$PYTHONPATH
export PATH=$SPARK_HOME/bin:$SPARK_HOME/python:$PATH
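If you can't edit .bashrc, you can also point Python at the Spark distribution from inside your script, before creating the SparkContext. This is a minimal sketch that mirrors the exports above; it assumes Spark is unpacked at the path shown in your traceback, so adjust SPARK_HOME to your actual install directory:

import os
import sys

# Assumption: this is where the Spark distribution is unpacked (from the
# traceback above); replace it with your actual Spark directory.
os.environ["SPARK_HOME"] = "/root/Desktop/pyspark/spark-3.0.0-preview2-bin-hadoop2.7-1/spark-3.0.0-preview2-bin-hadoop2.7"

# Make the bundled pyspark and py4j importable, just like the PYTHONPATH exports.
spark_python = os.path.join(os.environ["SPARK_HOME"], "python")
py4j_zip = os.path.join(spark_python, "lib", "py4j-0.10.8.1-src.zip")
sys.path[:0] = [spark_python, py4j_zip]

from pyspark import SparkConf, SparkContext
sc = SparkContext.getOrCreate(SparkConf())
print(sc.version)

The point either way is that program.py must import pyspark and py4j from the same Spark distribution that provides the JVM side, so that the Python API and the JVM API match.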

Hope this solves your error.
Thank You

answered Apr 7 by MD
Where can I find the .bashrc file? What is its location?

You can find the .bashrc file in the root user's home directory (/root/.bashrc).
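After adding the export lines there, reload it with source /root/.bashrc (or open a new terminal) so the new variables take effect before you run your program again.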
