Databricks-Connect Error

For some reason I’m getting the following error when using Databricks-Connect. The thing is, absolutely nothing has changed in my environment, and it worked fine yesterday. Why is the shard address suddenly invalid?

An error occurred while calling o28.csv.
: com.databricks.service.SparkServiceConnectionException: Invalid shard address: ""

To connect to a Databricks cluster, you must specify the URL of your Databricks shard.
Shard address: The URL of your shard (e.g., "")
  - Get current value: spark.conf.get("spark.databricks.service.address")
  - Set via conf: spark.conf.set("spark.databricks.service.address", <your shard address>)
  - Set via environment variable: export DATABRICKS_ADDRESS=<your shard address>
	at com.databricks.service.SparkServiceDebugHelper$.validateSparkServiceAddress(SparkServiceDebugHelper.scala:121)
	at com.databricks.service.SparkClientManager.$anonfun$getForSession$3(SparkClient.scala:384)
	at org.sparkproject.guava.cache.LocalCache$LocalManualCache$1.load(
	at org.sparkproject.guava.cache.LocalCache$LoadingValueReference.loadFuture(
	at org.sparkproject.guava.cache.LocalCache$Segment.loadSync(
	at org.sparkproject.guava.cache.LocalCache$Segment.lockedGetOrLoad(
	at org.sparkproject.guava.cache.LocalCache$Segment.get(
	at org.sparkproject.guava.cache.LocalCache.get(
	at org.sparkproject.guava.cache.LocalCache$LocalManualCache.get(
	at com.databricks.service.SparkClientManager.liftedTree1$1(SparkClient.scala:377)
	at com.databricks.service.SparkClientManager.getForSession(SparkClient.scala:376)
	at com.databricks.service.SparkClientManager.getForSession$(SparkClient.scala:353)
	at com.databricks.service.SparkClientManager$.getForSession(SparkClient.scala:401)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:292)
	at org.apache.spark.sql.DataFrameReader.csv(DataFrameReader.scala:715)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(
	at java.base/java.lang.reflect.Method.invoke(
	at py4j.reflection.MethodInvoker.invoke(
	at py4j.reflection.ReflectionEngine.invoke(
	at py4j.Gateway.invoke(
	at py4j.commands.AbstractCommand.invokeMethod(
	at py4j.commands.CallCommand.execute(
	at py4j.ClientServerConnection.waitForCommands(
	at java.base/
Caused by: org.apache.http.client.HttpResponseException: status code: 404
	at com.databricks.service.DBAPIClient.get(DBAPIClient.scala:101)
	at com.databricks.service.SparkServiceDebugHelper$.validateSparkServiceAddress(SparkServiceDebugHelper.scala:118)
	... 26 more

After a lot of poking around I found the issue: the shard address must not end with a trailing forward slash. Earlier versions of Databricks Connect tolerated this, but the latest stable release (10.4 LTS) does not. The "Caused by" section above hints at this: the address validation request comes back with a 404, presumably because the trailing slash produces a URL the API doesn't recognise, and the client then reports the address as invalid.
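If you build the address programmatically, it's worth normalizing it before handing it to Spark. A minimal sketch (the helper name and workspace URL are my own placeholders, not part of Databricks Connect):

```python
def normalize_shard_address(addr: str) -> str:
    """Drop any trailing slash so 10.4 LTS accepts the shard address."""
    return addr.rstrip("/")

# Hypothetical workspace URL -- substitute your own.
addr = normalize_shard_address("https://adb-1234567890123456.7.azuredatabricks.net/")
print(addr)  # https://adb-1234567890123456.7.azuredatabricks.net
```

The cleaned value can then go into `spark.conf.set("spark.databricks.service.address", addr)` or the `DATABRICKS_ADDRESS` environment variable, exactly as the error message suggests.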

After removing the slash, running databricks-connect test passes:

* PySpark is installed at ...
* Checking SPARK_HOME
* Checking java version
openjdk version "11.0.15" 2022-04-19 LTS
OpenJDK Runtime Environment Corretto- (build 11.0.15+9-LTS)
OpenJDK 64-Bit Server VM Corretto- (build 11.0.15+9-LTS, mixed mode)
WARNING: Java versions >8 are not supported by this SDK
* Skipping scala command test on Windows
* Testing python command
22/06/23 11:09:47 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/06/23 11:09:50 WARN MetricsSystem: Using default name SparkStatusTracker for source because neither spark.metrics.namespace nor is set.
View job details at ...
* Simple PySpark test passed
* Testing dbutils.fs
* Simple dbutils test passed
* All tests passed.
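If you prefer the environment-variable route, the same trailing-slash normalization can live in your shell profile. The workspace URL below is a placeholder:

```shell
# Hypothetical workspace URL -- substitute your own.
DATABRICKS_ADDRESS="https://adb-1234567890123456.7.azuredatabricks.net/"
# Strip any trailing slash before exporting, so the address validation passes.
export DATABRICKS_ADDRESS="${DATABRICKS_ADDRESS%/}"
echo "$DATABRICKS_ADDRESS"
# databricks-connect test   # then re-run the connectivity test
```

The `${var%/}` expansion removes a single trailing slash if present and leaves the value untouched otherwise.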

Em, excuse me! Have Android 📱 and use Databricks? You might be interested in my totally free (and ad-free) Pocket Bricks. You can get it from Google Play.

Thanks! You can always email me or use the contact form for more questions, comments, etc.