How to get or set Databricks spark configuration
source link: https://jdhao.github.io/2023/05/13/databricks-spark-get-set-conf/
In this post, I summarize how to get or set a Databricks Spark configuration/property.
Get the value for a specific setting/configuration
To get all configurations in Python:
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
all_conf = spark.sparkContext.getConf().getAll()
This returns all available configurations as a list of (key, value) tuples. To get the value of a specific conf, e.g., spark.databricks.clusterUsageTags.region, use the following code instead:
spark.conf.get("spark.databricks.clusterUsageTags.region")
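Since getAll() returns plain Python tuples, the full list can also be searched with an ordinary list comprehension. A minimal sketch (the sample_conf list below is made-up data standing in for the real getAll() output):

```python
# sample_conf stands in for spark.sparkContext.getConf().getAll(),
# which returns a list of (key, value) string tuples.
sample_conf = [
    ("spark.app.name", "my-app"),
    ("spark.databricks.clusterUsageTags.region", "us-west-2"),
    ("spark.executor.memory", "8g"),
]

def find_conf(conf_pairs, keyword):
    """Return the (key, value) pairs whose key contains keyword."""
    return [(k, v) for k, v in conf_pairs if keyword in k]

print(find_conf(sample_conf, "databricks"))
# prints [('spark.databricks.clusterUsageTags.region', 'us-west-2')]
```

This is handy when you know only part of a configuration name and want to discover the full key before calling spark.conf.get.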
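The title mentions setting configurations as well: session-scoped properties can be set with spark.conf.set. A sketch that requires a live Spark session (spark.myapp.owner is a made-up key used purely for illustration):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Set a session-scoped property (made-up key for illustration).
spark.conf.set("spark.myapp.owner", "jdhao")

# Read it back.
spark.conf.get("spark.myapp.owner")  # -> 'jdhao'
```

Note that many cluster-level settings are not modifiable at runtime; attempting to change them with spark.conf.set raises an error, and they must instead be set in the cluster's Spark config at creation time.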
Check Databricks version
To check the Databricks runtime version, use the following code:
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")
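The returned value is a version string, not a number; the exact format varies by cluster type, but it looks something like "11.3.x-scala2.12" (an assumed sample value, not taken from the article). The runtime version part can be extracted with ordinary string handling:

```python
# Assumed sample value standing in for
# spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")
spark_version = "11.3.x-scala2.12"

# The Databricks runtime version is the segment before the first "-".
runtime = spark_version.split("-")[0]
print(runtime)  # prints 11.3.x
```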