Set Table Property (Metadata) in Spark or Databricks

Sometimes it’s useful to attach extra metadata to a table in Databricks (Spark). For instance, I needed to store a table version. In theory you could create an extra table to hold it, but that quickly becomes messy.

Luckily, Spark SQL's `ALTER TABLE ... SET/UNSET TBLPROPERTIES` and `SHOW TBLPROPERTIES` commands let you do exactly this. Below are the helper functions:

from typing import List, Optional

from pyspark.sql import Row, SparkSession


def set_table_property(spark: SparkSession, table_name: str, property_name: str, value: str) -> None:
    """Attach (or overwrite) a custom property on the table's metadata."""
    spark.sql(f"""ALTER TABLE {table_name} SET TBLPROPERTIES ("{property_name}" = "{value}")""")


def get_table_property(spark: SparkSession, table_name: str, property_name: str,
                       default_value: Optional[str] = None) -> Optional[str]:
    """Read a property back, returning default_value when it is not set."""
    df = spark.sql(f"SHOW TBLPROPERTIES {table_name}")
    row_list: List[Row] = df.collect()
    for row in row_list:
        # SHOW TBLPROPERTIES returns one row per property, with "key" and "value" columns.
        if str(row["key"]) == property_name:
            return str(row["value"])
    return default_value
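To illustrate the lookup that `get_table_property` performs on the collected rows, here is a minimal self-contained sketch. Plain dictionaries stand in for the `pyspark.sql.Row` objects returned by `SHOW TBLPROPERTIES`, and the property names are hypothetical (Delta tables typically also report built-in properties such as `delta.minReaderVersion` alongside your custom ones):

```python
# Simulated output of `SHOW TBLPROPERTIES my_table` after .collect():
# each row exposes a "key" and a "value" column.
rows = [
    {"key": "delta.minReaderVersion", "value": "1"},
    {"key": "table_version", "value": "42"},
]

def lookup_property(rows, property_name, default_value=None):
    """Same logic as get_table_property, minus the Spark call."""
    for row in rows:
        if str(row["key"]) == property_name:
            return str(row["value"])
    return default_value

print(lookup_property(rows, "table_version"))       # -> 42
print(lookup_property(rows, "does_not_exist", "n/a"))  # -> n/a
```

Note that values always come back as strings, so a numeric version needs an explicit `int(...)` on the caller's side.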

Em, excuse me! Have Android 📱 and use Databricks? You might be interested in my totally free (and ad-free) Pocket Bricks, available on Google Play.


To contact me, send an email anytime or leave a comment below.