How to get SQL configuration in Spark using Python?

Hi. Can someone help me get the Spark SQL configuration of my Spark application? I am using Python for this, and I want to know how I can retrieve the configuration details. Please help.
Mar 18 in Apache Spark by Ganesh

1 answer to this question.


You can get the configuration details through the Spark session. First create a Spark session, then run the SET -v SQL command, which lists every SQL configuration property along with its value and description. For example:

from pyspark.sql import SparkSession

# Create a Spark session (or reuse an existing one)
spark = SparkSession.builder.appName('abc').getOrCreate()

# SET -v returns all SQL configuration properties with their values;
# raise n if your output is cut off at 200 rows
spark.sql("SET -v").show(n=200, truncate=False)
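If you only need one property rather than the full list, you can also read (and change) runtime SQL settings through spark.conf. A minimal sketch, assuming the default session above; spark.sql.shuffle.partitions is just an example property:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('abc').getOrCreate()

# Read a single configuration property by name
print(spark.conf.get("spark.sql.shuffle.partitions"))

# Runtime SQL properties can be changed through the same interface
spark.conf.set("spark.sql.shuffle.partitions", "50")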
answered Mar 18 by John
