How to get SQL configuration in Spark using Python

0 votes
Hi. Can someone help me get the Spark SQL configuration of my Spark application? I am using Python for this, and I want to know how to get the configuration details. Please help.
Mar 18, 2019 in Apache Spark by Ganesh
959 views

1 answer to this question.

0 votes

You can get the configuration details through the Spark session. First create a Spark session, then run the SET -v SQL command, which lists all Spark SQL configuration properties along with their values and descriptions. For example:

from pyspark.sql import SparkSession
# Create (or reuse) a Spark session
spark = SparkSession.builder.appName('abc').getOrCreate()
# SET -v lists every Spark SQL configuration property with its value and description
spark.sql("SET -v").show(n=200, truncate=False)
answered Mar 18, 2019 by John

Related Questions In Apache Spark

0 votes
1 answer

How to get Spark SQL configuration?

First create a Spark session like this: val ...READ MORE

answered Mar 18, 2019 in Apache Spark by John
3,113 views
0 votes
5 answers

How to change the Spark session configuration in PySpark?

You aren't actually overwriting anything with this ...READ MORE

answered Dec 14, 2020 in Apache Spark by Gitika
• 65,910 points
121,590 views
0 votes
1 answer

How to get ID of a map task in Spark?

You can access task information using TaskContext: import org.apache.spark.TaskContext sc.parallelize(Seq[Int](), ...READ MORE

answered Nov 20, 2018 in Apache Spark by Frankie
• 9,830 points
3,046 views
0 votes
1 answer

How to use the FTP scheme with YARN in a Spark application?

In case YARN does not support schemes ...READ MORE

answered Mar 28, 2019 in Apache Spark by Raj
909 views
–1 vote
1 answer

PySpark RDD: How to get the partition number in the output?

The glom function is what you are looking for: glom(self): ...READ MORE

answered Jan 8, 2019 in Python by Omkar
• 69,210 points
2,289 views
0 votes
0 answers

try-except is not working while using HDFS command

Hi, I am trying to run the following things ...READ MORE

Mar 6, 2019 in Python by anonymous
934 views
0 votes
1 answer

Get Spark SQL configuration in Java

You will need to use Spark session ...READ MORE

answered Mar 18, 2019 in Apache Spark by John
767 views
0 votes
1 answer

Using R to display configuration of Spark SQL

Try the below-mentioned code. sparkR.session() properties <- sql("SET -v") showDF(properties, ...READ MORE

answered Mar 18, 2019 in Apache Spark by John
381 views