How to get SQL configuration in Spark using Python?

Hi. Can someone help me get the Spark SQL configuration of my Spark application? I want to know how I can get the configuration details using Python. Please help.
Mar 18 in Apache Spark by Ganesh

1 answer to this question.


You can get the configuration details through the Spark session. First create a Spark session, then run the `SET -v` SQL command, which returns all Spark SQL configuration properties along with their values and descriptions. For example:

from pyspark.sql import SparkSession

# Create (or reuse) a Spark session
spark = SparkSession.builder.appName('abc').getOrCreate()

# List all Spark SQL configuration properties with values and descriptions;
# raise n and disable truncation so the full list is readable
spark.sql("SET -v").show(n=200, truncate=False)
answered Mar 18 by John

