I'm trying to connect to my local SQL Server instance from Databricks (Community Edition), but I get this error:
com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host 127.0.0.1, port 1433 has failed. Error: "Connection refused (Connection refused). Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall."
I tried the recommendations from here.
I enabled TCP/IP, set the port to 1433, and restarted the server; I also enabled the IP address and checked the connection to the port with telnet.
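(For reference, the telnet check I ran can be reproduced with Python's standard `socket` module; this is just a minimal sketch of the same test, with the host and port from my setup:)

```python
import socket

# Equivalent of `telnet 127.0.0.1 1433`: this succeeds only if
# something is listening on the port and accepting TCP connections.
try:
    with socket.create_connection(("127.0.0.1", 1433), timeout=3):
        print("port 1433 is open")
except OSError as exc:
    print(f"connection failed: {exc}")
```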
This is the first attempt:
table = (spark.read
    .format("sqlserver")
    .option("host", "127.0.0.1")
    .option("port", "1433")
    .option("user", user)
    .option("password", passw)
    .option("database", "TestDatabase")
    .option("dbtable", "dbo.table1")
    .load()
)
And the second:
driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
url = "jdbc:sqlserver://127.0.0.1:1433;database=TestDatabase"  # jdbc:sqlserver://{database_host}:{database_port};database={database_name}
table = (spark.read
    .format("jdbc")
    .option("driver", driver)
    .option("url", url)
    .option("dbtable", "dbo.table1")
    .option("user", user)
    .option("password", password)
    .load()
)
In both cases I get the same error. For the host name I tried '127.0.0.1', 'localhost', and 'DESKTOP-N1VTIO2'.
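(To rule out DNS or reachability differences between the three host names, I can run a quick TCP probe over all of them from a notebook cell; a minimal sketch, where `check_hosts` is just a helper name I made up:)

```python
import socket

def check_hosts(hosts, port=1433, timeout=3.0):
    """Attempt a TCP connection to each host and report the outcome."""
    results = {}
    for host in hosts:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                results[host] = "reachable"
        except OSError as exc:  # covers refused, timeout, and DNS failures
            results[host] = f"failed: {exc}"
    return results

# The three host names tried in the question.
print(check_hosts(["127.0.0.1", "localhost", "DESKTOP-N1VTIO2"]))
```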