When developing locally, it's sometimes useful to pin the Spark driver host to a specific IP address.
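A minimal sketch of that runtime approach, assuming an existing `SparkSession` (`myhost` is a placeholder for the actual hostname or IP):

```python
from pyspark.sql import SparkSession

# Assumes a local session for illustration.
spark = SparkSession.builder.master("local[*]").getOrCreate()

# Point the driver at a specific address; "myhost" is a placeholder.
spark.conf.set("spark.driver.host", "myhost")
```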
However, since Spark 3.0 this fails with the following error:
org.apache.spark.sql.AnalysisException: Cannot modify the value of a Spark config: spark.driver.host
Spark 3.0 rejects SET commands on core configs by default; the workaround is to restore the legacy behavior first, then set the host:

spark.conf.set("spark.sql.legacy.setCommandRejectsSparkCoreConfs", False)
spark.conf.set("spark.driver.host", "myhost")
To contact me, send an email anytime or leave a comment below.