pyspark.sql.SparkSession.conf
property SparkSession.conf
Runtime configuration interface for Spark.
This is the interface through which the user can get and set all Spark and Hadoop configurations that are relevant to Spark SQL. When getting the value of a config, this defaults to the value set in the underlying SparkContext, if any.

New in version 2.0.0.

Changed in version 3.4.0: Supports Spark Connect.
Returns
pyspark.sql.conf.RuntimeConfig
Examples
>>> spark.conf
<pyspark...RuntimeConf...>
Set a runtime configuration for the session
>>> spark.conf.set("key", "value")
>>> spark.conf.get("key")
'value'
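As an additional sketch of the same interface (the key name "missing.key" is illustrative, and this assumes spark.sql.shuffle.partitions has not been made static elsewhere): get accepts a fallback default, unset removes a session-level value, and isModifiable checks whether a config can be changed at runtime.

>>> spark.conf.get("missing.key", "fallback")  # returns the supplied default when the key is not set
'fallback'
>>> spark.conf.unset("key")  # remove the session-level value set above
>>> spark.conf.isModifiable("spark.sql.shuffle.partitions")  # True for configs that may be set at runtime
True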