shaded.databricks.org.apache.hadoop.fs.azure.AzureException: An exception while trying read from storage container in azure blob storage

SAL 0 Reputation points
2024-09-05T14:45:46.2066667+00:00

The first part of my code works fine:

dbutils.widgets.text("AutoLoanFilePath", "")
inputPath = dbutils.widgets.get("AutoLoanFilePath")
inputPath = 'SEPT_2024/FAMILY_SECURITY'  # note: this hardcoded value overrides the widget value above
autoPath = 'dbfs:/mnt/dbs_adls_mnt/Prod_landing/' + inputPath
autoLoanPath = autoPath + '/AUTOLOAN1.csv'
criteriaPath = autoPath + "/CRITERIA.csv"
newratePath = autoPath + "/NEWRATE.csv"
autoloanoutputPath = autoPath + "/AutoLoan1Processed.csv"
lsPath = '/mnt/dbs_adls_mnt/Prod_landing/' + inputPath
print(autoLoanPath)

But after that, when I try to list the mounted directory:

%fs
ls /mnt/dbs_adls_mnt/Prod_landing

it throws: shaded.databricks.org.apache.hadoop.fs.azure.AzureException: An exception while trying read from storage container in azure blob storage. Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.


1 answer

  1. PRADEEPCHEEKATLA-MSFT 90,146 Reputation points Microsoft Employee
    2024-09-06T04:41:37.6966667+00:00

    @SAL - Thanks for the question and using MS Q&A platform.

    It seems like you are facing an authentication issue while trying to read from a storage container in Azure Blob Storage. This error occurs when the server fails to authenticate the request.

    To resolve this issue, try the following steps:

    • Check that the value of the Authorization header, including the signature, is formed correctly.
    • Make sure the credentials used to authenticate the request are valid and have the required permissions on the storage container.
    • Check that the storage account key or SAS token used to authenticate the request is still valid and has not expired.
    • Ensure that your storage account's firewall settings are configured to allow access from your Databricks cluster.
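    For the SAS-token expiry check in particular, one quick way to inspect a token outside of Databricks is to parse its `se` (signed expiry) query field. The helper below, `sas_token_expired`, is a hypothetical utility written for this answer, not part of any Azure SDK, and the token shown is fabricated:

```python
from datetime import datetime, timezone
from urllib.parse import parse_qs

def sas_token_expired(sas_token, now=None):
    """Return True if the SAS token's 'se' (signed expiry) field is in the past."""
    params = parse_qs(sas_token.lstrip("?"))
    expiry = params.get("se", [None])[0]
    if expiry is None:
        raise ValueError("SAS token has no 'se' (signed expiry) field")
    # Azure emits ISO-8601 UTC timestamps such as 2024-09-01T00:00:00Z
    exp = datetime.fromisoformat(expiry.replace("Z", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return exp <= now

# Fabricated token whose expiry ('se') is 2024-09-01:
token = "sv=2022-11-02&se=2024-09-01T00%3A00%3A00Z&sr=c&sp=rl&sig=abc"
print(sas_token_expired(token))
```

    If this returns True for the token your mount was created with, regenerate the SAS token (or rotate back to a valid account key) and remount.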

    If you are still experiencing the issue, please share the complete stack trace of the error message so we can identify the root cause and suggest an exact solution.

    For more details, refer to Connect to Azure Data Lake Storage Gen2 and Blob Storage, and see the Azure documentation on troubleshooting common Blob Storage errors for more information on how to troubleshoot this issue.
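    If the credentials have been rotated since the mount was created, the mount itself has to be recreated. A minimal notebook sketch, assuming a hypothetical secret scope "my-scope" with key "storage-key", and placeholder storage account and container names (this runs only inside a Databricks notebook, where `dbutils` is defined):

```python
# All names below are placeholders for illustration.
storage_account = "mystorageacct"
container = "prod-landing"
mount_point = "/mnt/dbs_adls_mnt/Prod_landing"

# Read the current account key from a Databricks secret scope
# rather than hardcoding it in the notebook.
account_key = dbutils.secrets.get(scope="my-scope", key="storage-key")

# Unmount the stale mount (created with an expired/rotated credential), if present.
if any(m.mountPoint == mount_point for m in dbutils.fs.mounts()):
    dbutils.fs.unmount(mount_point)

# Remount with the refreshed key.
dbutils.fs.mount(
    source=f"wasbs://{container}@{storage_account}.blob.core.windows.net",
    mount_point=mount_point,
    extra_configs={
        f"fs.azure.account.key.{storage_account}.blob.core.windows.net": account_key
    },
)

# Verify the mount works before re-running the notebook.
display(dbutils.fs.ls(mount_point))
```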

    Hope this helps. Do let us know if you have any further queries.

