Here is the code:
access_key = dbutils.secrets.get(scope = "dll-gcp", key = "aws-access-key")
secret_key = dbutils.secrets.get(scope = "dll-gcp", key = "aws-secret-key")
encoded_secret_key = secret_key.replace("/", "%2F")
aws_bucket_name = "ls-ind-export"
mount_name = "s3"
dbutils.fs.mount(f"s3a://{access_key}:{encoded_secret_key}@{aws_bucket_name}", f"/mnt/{mount_name}")
display(dbutils.fs.ls(f"/mnt/{mount_name}"))
Running the mount fails with:

ExecutionError: An error occurred while calling o353.mount. : java.rmi.RemoteException: java.lang.UnsupportedOperationException: Key based mount points are not supported.; nested exception is:
This is a known limitation of Databricks on GCP: unlike Databricks on AWS and Azure, it doesn't support mount points backed by AWS keys, Azure service principals, and similar credentials. Instead, the recommended approach is to skip the mount and access the data directly via its s3a://.../... URL, as described in the documentation.
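A minimal sketch of that approach, reusing the secret scope, key names, and bucket from the question (the object path and file format in the read step are placeholders; sc, spark, and dbutils are the objects Databricks notebooks provide automatically):

# Pull the AWS credentials from the secret scope, as in the original code
access_key = dbutils.secrets.get(scope="dll-gcp", key="aws-access-key")
secret_key = dbutils.secrets.get(scope="dll-gcp", key="aws-secret-key")

# Put the credentials into the Hadoop configuration used by the s3a filesystem
sc._jsc.hadoopConfiguration().set("fs.s3a.access.key", access_key)
sc._jsc.hadoopConfiguration().set("fs.s3a.secret.key", secret_key)

aws_bucket_name = "ls-ind-export"

# List and read directly from the s3a URL instead of a /mnt mount point
display(dbutils.fs.ls(f"s3a://{aws_bucket_name}/"))
df = spark.read.format("csv").load(f"s3a://{aws_bucket_name}/some/path/")  # hypothetical path and format

Note that URL-encoding the secret key (the %2F replacement in the question) is no longer needed here, because the credentials go into the Hadoop configuration rather than being embedded in the URL itself.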