Databricks extension with VS Code: errors when running the notebook init script


I get this error right after opening VS Code.

Output Window:

Errors in 00-databricks-init-***********9.py:

create_and_register_databricks_globals - ValueError: default auth: metadata-service: Expecting value: line 1 column 1 (char 0). Config: host=HOST_URL, auth_type=metadata-service, cluster_id=90-014-aj25745p, metadata_service_url=***. Env: DATABRICKS_HOST, DATABRICKS_AUTH_TYPE, DATABRICKS_CLUSTER_ID, DATABRICKS_METADATA_SERVICE_URL

I'm trying to find a starting point but haven't found one. My environment:

  • Databricks CLI v0.212.3
  • databricks-connect==14.3.0
  • databricks-sdk==0.20.0
  • The host & token are set properly in .databrickscfg.
  • The http/https proxy environment variables are set properly too.
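Since the error reports `auth_type=metadata-service` even though the host and token are in `.databrickscfg`, it may help to confirm what the profile actually contains. A minimal sketch (the `DEFAULT` profile name and the default path are assumptions; this only parses the file, it does not authenticate):

```python
import configparser
import os

def read_databricks_profile(path, profile="DEFAULT"):
    """Parse a .databrickscfg file and return the fields of one profile."""
    cfg = configparser.ConfigParser()
    cfg.read(os.path.expanduser(path))
    if profile not in cfg:
        raise KeyError(f"profile {profile!r} not found in {path}")
    return dict(cfg[profile])

# Hypothetical usage: check that host and token are really present
fields = read_databricks_profile("~/.databrickscfg")
print(fields.get("host"), "token" in fields)
```

If `host` or `token` is missing here, the SDK falls back to other auth methods, which would be consistent with the `metadata-service` error above.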

The problem also affects the execution of individual cells using databricks-connect, which throws this error:

pyspark.errors.exceptions.connect.SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with: status = StatusCode.UNAVAILABLE details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:1**..137.3:3*: tcp handshaker shutdown" debug_error_string = "UNKNOWN:Error received from peer {created_time:"2024-02-20T11:09:21.235125+01:00", grpc_status:14, grpc_message:"failed to connect to all addresses; last error: UNKNOWN: ipv4:1.**.137.3:3*: tcp handshaker shutdown"}"

However, running the same files as workflow jobs on Databricks works fine. Any solutions?
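The first error message lists the environment variables the SDK inspects (`DATABRICKS_HOST`, `DATABRICKS_AUTH_TYPE`, `DATABRICKS_CLUSTER_ID`, `DATABRICKS_METADATA_SERVICE_URL`), and stale values there can override `.databrickscfg`. A small sketch to list what is currently set before launching VS Code (the variable names are taken from the error output; this only reads the environment):

```python
import os

def databricks_env_overrides():
    """Collect DATABRICKS_* variables that may override .databrickscfg."""
    return {k: v for k, v in sorted(os.environ.items())
            if k.startswith("DATABRICKS_")}

for name, value in databricks_env_overrides().items():
    # If DATABRICKS_AUTH_TYPE or DATABRICKS_METADATA_SERVICE_URL shows up
    # here unexpectedly, unsetting it before starting VS Code may let the
    # SDK fall back to profile-based (host/token) auth.
    print(f"{name}={value}")
```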
