Access denied error while setting up spark config in Python

I am getting a "Failed to create directory /tmp/spark-a12d2057-a5ea-4fb9-876f-382d41153eab" error while setting up the Spark config for an ETL application. I am running the application in a Docker container. This is the code for the Spark config:

from pyspark import SparkConf
from pyspark.sql import SparkSession


def spark_config(JAR_FILE, APP_NAME):
    # Make the extra JAR available on the driver and executor classpaths
    conf = SparkConf()
    conf.set("spark.jars", JAR_FILE)
    spark = SparkSession.builder.appName(APP_NAME) \
        .config("spark.sql.debug.maxToStringFields", 4000) \
        .config("mapreduce.fileoutputcommitter.marksuccessfuljobs", "false") \
        .config("spark.executor.memory", "8G") \
        .config("spark.driver.memory", "8G") \
        .config("spark.driver.maxResultSize", "15G") \
        .config("spark.sql.autoBroadcastJoinThreshold", -1) \
        .config(conf=conf) \
        .getOrCreate()
    return spark
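For reference, the helper is called roughly like this (the JAR path and app name below are placeholders, not the real values):

spark = spark_config("/opt/jars/etl-deps.jar", "my-etl-app")  # placeholder arguments
spark.range(5).show()  # simple sanity check once the session exists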

Error:

 ERROR JavaUtils: Failed to create directory /tmp/spark-a12d2057-a5ea-4fb9-876f-382d41153eab
java.nio.file.AccessDeniedException: /tmp
Exception in thread "main" java.io.IOException: Failed to create a temp directory (under /tmp) after 10 attempts!
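Would pointing Spark's scratch space at a different, writable directory inside the container help? A minimal sketch of what I mean (the /opt/spark-tmp path is hypothetical, and whether the JVM options can be set this way from SparkConf is an assumption on my side):

from pyspark import SparkConf
from pyspark.sql import SparkSession

TMP_DIR = "/opt/spark-tmp"  # assumed to exist and be writable in the image

conf = SparkConf()
conf.set("spark.local.dir", TMP_DIR)  # shuffle / scratch files
# The spark-* directory from the error is created under java.io.tmpdir, so the
# JVM temp dir may also need redirecting; depending on how the driver JVM is
# launched this might have to be passed via spark-submit instead of SparkConf.
conf.set("spark.driver.extraJavaOptions", f"-Djava.io.tmpdir={TMP_DIR}")
conf.set("spark.executor.extraJavaOptions", f"-Djava.io.tmpdir={TMP_DIR}")

spark = SparkSession.builder.appName("tmp-dir-check").config(conf=conf).getOrCreate()

Or is the right fix to make /tmp itself writable for the container user in the Dockerfile?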

Any help or pointers will be appreciated!

There are 0 answers