Databricks JDBC "isolationLevel" Option Not Enabling Dirty Reads


I need to read data from a Microsoft SQL Server database over JDBC from Databricks, replicating the WITH (NOLOCK) SQL feature. According to the official documentation, the way to do this is to set the "isolationLevel" option to "READ_UNCOMMITTED". However, when I deliberately lock the table (on SQL Server) and then read it with the specified "isolationLevel" (from Databricks), the query always hangs until the table is unlocked.

Below is the PySpark code I am using to try to read the table on Databricks.

df = (spark.read
      .format("jdbc")
      .option("isolationLevel", "READ_UNCOMMITTED")
      .option("url", jdbcUrl)
      .option("port", port)
      .option("user", user)
      .option("password", password)
      .option("database", db_name)
      .option("loginTimeout", 30)
      .option("dbtable", dbtable)
      .load())
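For reference, the Spark JDBC source also exposes a "sessionInitStatement" option that executes a custom SQL statement right after each JDBC session is opened, before Spark issues its SELECT. A minimal sketch of what the option set would look like (Spark is not invoked here; the url value is a placeholder, and whether this actually bypasses the lock in my scenario is untested):

```python
# Sketch only: the same JDBC options as above, plus sessionInitStatement,
# which asks SQL Server for dirty reads at the session level.
jdbc_options = {
    "url": "jdbc:sqlserver://<host>:1433",  # placeholder, not a real host
    "dbtable": "dbo.Test_Table",
    # Executed once per JDBC session, before Spark runs its generated query:
    "sessionInitStatement": "SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED",
}

# In Spark this would be applied as:
#   spark.read.format("jdbc").options(**jdbc_options).load()
```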

df.write.format("delta").mode("overwrite").option("overwriteSchema", True).saveAsTable("test_table")

This is the SQL query I'm using to lock the target table on SQL Server.

BEGIN TRAN  
SELECT TOP (1) 1 FROM dbo.Test_Table WITH (TABLOCKX)
WAITFOR DELAY '00:02:00' 
ROLLBACK TRAN;
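The TABLOCKX lock above is exactly what a WITH (NOLOCK) read should bypass. One workaround I have seen suggested (untested here) is to embed the hint directly in the "dbtable" value, since Spark pastes that string into the FROM clause of the query it generates. A sketch with a hypothetical helper (with_nolock is my own name, not from any API):

```python
def with_nolock(table: str) -> str:
    """Append SQL Server's NOLOCK table hint to a dbtable value.

    Spark builds a query like "SELECT ... FROM <dbtable> ...", so the hint
    travels along with the table name (hypothetical helper, untested).
    """
    return f"{table} WITH (NOLOCK)"

# Would be used as: .option("dbtable", with_nolock("dbo.Test_Table"))
print(with_nolock("dbo.Test_Table"))  # dbo.Test_Table WITH (NOLOCK)
```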

Is there something wrong with my code, or is this a limitation of the JDBC driver/connection?
