Spark - Export a DataFrame Schema and Import It Later


During one job run I ended up with a DataFrame whose schema was very specific and not known beforehand. I'd like to export this schema to disk so I can use it again later.


Export Schema as JSON

# Avoid naming the variable `json` — that would shadow the json module.
schema_json: str = df.schema.json()

Then save it somewhere.
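For example, the string can be written to a plain file with standard Python I/O. This is a minimal sketch; the file name `schema.json` and the sample schema string (standing in for the real `df.schema.json()` output) are just illustrative:

```python
import json

# Sample schema JSON, standing in for the string returned by df.schema.json().
schema_json = (
    '{"type":"struct","fields":'
    '[{"name":"id","type":"long","nullable":true,"metadata":{}}]}'
)

# Persist it next to the data (the path is an example).
with open("schema.json", "w") as f:
    f.write(schema_json)

# Later, read it back as plain text.
with open("schema.json") as f:
    restored = f.read()
```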

Import Schema from JSON

import json
from pyspark.sql.types import StructType

# json_text is the string previously produced by df.schema.json()
# and read back from disk.
json_object = json.loads(json_text)

schema = StructType.fromJson(json_object)

Thanks! You can always email me or use the contact form for questions, comments, etc.