Spark Host Configuration is Rejected in Spark 3

When developing locally, it is sometimes useful to set the Spark driver host to a specific address, like this:

spark.conf.set("spark.driver.host", "myhost")

However, since Spark 3.0 the same call fails with the following error:

org.apache.spark.sql.AnalysisException: Cannot modify the value of a Spark config: spark.driver.host

Quick fix: Spark 3.0 made the SET command reject core Spark configs by default, and the legacy flag below restores the pre-3.0 behavior:

spark.conf.set("spark.sql.legacy.setCommandRejectsSparkCoreConfs", False)
spark.conf.set("spark.driver.host", "myhost")


To contact me, send an email anytime or leave a comment below.