How to get SQL configuration in Spark using Python

0 votes
Hi. Can someone help me get the Spark SQL configuration of my Spark application? I am using Python for this, and I want to know how I can get the configuration details. Please help.
Mar 18, 2019 in Apache Spark by Ganesh
478 views

1 answer to this question.

0 votes

You can get the configuration details through the Spark session. First create a Spark session, then run the SET -v SQL command, which lists every Spark SQL configuration property along with its value and description. For example:

from pyspark.sql import SparkSession

# Create (or reuse) a Spark session
spark = SparkSession.builder.appName('abc').getOrCreate()

# SET -v returns all Spark SQL configuration properties with values and descriptions
spark.sql("SET -v").show(n=200, truncate=False)
answered Mar 18, 2019 by John

Related Questions In Apache Spark

0 votes
1 answer

How to get Spark SQL configuration?

First create a Spark session like this: val ...READ MORE

answered Mar 18, 2019 in Apache Spark by John
1,516 views
0 votes
5 answers

How to change the Spark session configuration in PySpark?

You aren't actually overwriting anything with this ...READ MORE

answered Dec 14, 2020 in Apache Spark by Gitika
• 65,970 points
67,488 views
0 votes
1 answer

How to get ID of a map task in Spark?

You can access task information using TaskContext: import org.apache.spark.TaskContext sc.parallelize(Seq[Int](), ...READ MORE

answered Nov 20, 2018 in Apache Spark by Frankie
• 9,810 points
2,048 views
0 votes
1 answer

How to use the ftp scheme with Yarn in a Spark application?

In case Yarn does not support schemes ...READ MORE

answered Mar 28, 2019 in Apache Spark by Raj
436 views
–1 vote
1 answer

PySpark RDD: How to get the partition number in the output?

The glom function is what you are looking for: glom(self): ...READ MORE

answered Jan 8, 2019 in Python by Omkar
• 69,150 points
1,466 views
0 votes
0 answers

try-except is not working while using HDFS command

Hi, I am trying to run the following things ...READ MORE

Mar 6, 2019 in Python by anonymous
393 views
0 votes
1 answer

Get Spark SQL configuration in Java

You will need to use Spark session ...READ MORE

answered Mar 18, 2019 in Apache Spark by John
317 views
0 votes
1 answer

Using R to display configuration of Spark SQL

Try the below-mentioned code. sparkR.session() properties <- sql("SET -v") showDF(properties, ...READ MORE

answered Mar 18, 2019 in Apache Spark by John
118 views