Databricks access to ADLS Gen 1 using certificate

Chand, Anupam SBOBNG-ITA/RX 461 Reputation points
2021-05-31T14:02:29.34+00:00

I've referred to the Databricks documentation https://docs.databricks.com/data/data-sources/azure/azure-datalake.html#language-scala, which describes how to mount ADLS Gen1 using a client ID and secret. I wanted to check whether doing the same with a certificate is currently possible.
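For context, the client-ID-and-secret mount from that documentation looks roughly like the following Python notebook sketch (all placeholder values such as `<application-id>` and `<tenant-id>` are hypothetical; `dbutils` is only available inside a Databricks notebook):

```python
# OAuth 2.0 client-credential configuration for ADLS Gen1.
# All angle-bracket values are hypothetical placeholders -- substitute your own.
configs = {
    "fs.adl.oauth2.access.token.provider.type": "ClientCredential",
    "fs.adl.oauth2.client.id": "<application-id>",
    "fs.adl.oauth2.credential": dbutils.secrets.get(scope="<scope-name>", key="<key-name>"),
    "fs.adl.oauth2.refresh.url": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# Mount the Gen1 store under DBFS so it can be read like a local path.
dbutils.fs.mount(
    source="adl://<storage-resource>.azuredatalakestore.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs=configs,
)
```

My question is whether the `fs.adl.oauth2.credential` secret above can be replaced with a certificate-based credential.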

I found a Stack Overflow question (https://stackoverflow.com/questions/63684312/using-client-certificate-instead-of-client-secret-to-authenticate-against-datala) which says this is currently not possible for ADLS Gen2, but there is no information on whether it is possible for Gen1.

Tags: Azure Data Lake Storage, Azure Databricks, Microsoft Entra ID

Accepted answer
    PRADEEPCHEEKATLA-MSFT 88,791 Reputation points Microsoft Employee
    2021-06-01T03:13:30.99+00:00

    Hello anonymous user,

    Thanks for the question and using MS Q&A platform.

    Unfortunately, mounting the Azure Data Lake Gen1 store with a client certificate is not supported.

    I would suggest providing feedback on the same:

    https://feedback.azure.com/forums/909463-azure-databricks

    All of the feedback you share in these forums will be monitored and reviewed by the Microsoft engineering teams responsible for building Azure.

    Currently, there are three ways of accessing Azure Data Lake Storage Gen1:

    • Pass your Azure Active Directory credentials, also known as credential passthrough.
    • Mount an Azure Data Lake Storage Gen1 filesystem to DBFS using a service principal and OAuth 2.0.
    • Use a service principal directly.

    For more details, refer to Azure Databricks – Azure Data Lake Storage Gen1.
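    As an illustration of the third option (using a service principal directly, without a mount), the OAuth properties can be set on the Spark session and the store read via its `adl://` URI. This is a Python notebook sketch; the angle-bracket values and the file path are hypothetical placeholders:

    ```python
    # Set OAuth client-credential properties on the Spark session
    # (runs inside a Databricks notebook; placeholder values are hypothetical).
    spark.conf.set("fs.adl.oauth2.access.token.provider.type", "ClientCredential")
    spark.conf.set("fs.adl.oauth2.client.id", "<application-id>")
    spark.conf.set("fs.adl.oauth2.credential",
                   dbutils.secrets.get(scope="<scope-name>", key="<key-name>"))
    spark.conf.set("fs.adl.oauth2.refresh.url",
                   "https://login.microsoftonline.com/<tenant-id>/oauth2/token")

    # Read directly with the adl:// URI -- no DBFS mount involved.
    df = spark.read.csv("adl://<storage-resource>.azuredatalakestore.net/<path>/data.csv")
    ```

    Note that in all of these options the service principal still authenticates with a client secret, not a certificate.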

    Hope this helps. Do let us know if you have any further queries.

    ---------------------------------------------------------------------------

    Please "Accept the answer" if the information helped you. This will help us and others in the community as well.

    1 person found this answer helpful.

0 additional answers
