Storage connection strings
Applies to: ✅ Microsoft Fabric ✅ Azure Data Explorer
The Kusto service can interact with external storage services. For example, you can create Azure Storage external tables to query data stored in external storage.
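As an illustration, the following sketch defines an external table over a hypothetical Blob Storage container; the storage account, container, schema, and authentication method (impersonation, described later in this article) are assumptions.

```kusto
// Define an external table over CSV files in a hypothetical Blob Storage container.
// Queries against SalesExternal read the data in place, without ingesting it.
.create external table SalesExternal (OrderId: long, OrderDate: datetime, Amount: real)
kind=storage
dataformat=csv
(
    h@'https://fabrikam.blob.core.windows.net/sales-data;impersonate'
)
```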
The following types of external storage are supported:
- Azure Blob Storage
- Azure Data Lake Storage Gen2
- Azure Data Lake Storage Gen1
- Amazon S3
Each type of storage has a corresponding connection string format. A connection string is a URI that describes the storage resource and the properties necessary to access it, such as security credentials.
Note
HTTP web services support is limited to retrieving resources from arbitrary HTTP web services. Other operations, such as writing resources, are not supported.
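For example, a query could read a publicly accessible CSV file over HTTPS with the externaldata operator; the URL and column schema below are placeholders.

```kusto
// Retrieve a CSV resource from an arbitrary HTTP web service (read-only).
// The URL and column schema are hypothetical placeholders.
externaldata (City: string, Population: long)
[
    @"https://example.com/open-data/cities.csv"
]
with (format = "csv")
```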
Each storage type has a different connection string format. See the following table for connection string templates for each storage type.
Storage Type | Scheme | URI template |
---|---|---|
Azure Blob Storage | https:// | https://StorageAccountName.blob.core.windows.net/Container[/BlobName][CallerCredentials] |
Azure Data Lake Storage Gen2 | https:// | https://StorageAccountName.dfs.core.windows.net/Filesystem[/PathToDirectoryOrFile][CallerCredentials] |
Azure Data Lake Storage Gen2 | abfss:// | abfss://Filesystem@StorageAccountName.dfs.core.windows.net/[PathToDirectoryOrFile][CallerCredentials] |
Azure Data Lake Storage Gen1 | adl:// | adl://StorageAccountName.azuredatalakestore.net/PathToDirectoryOrFile[CallerCredentials] |
Amazon S3 | https:// | https://BucketName.s3.RegionName.amazonaws.com/ObjectKey[CallerCredentials] |
HTTP web services | https:// | https://Hostname/PathAndQuery |
Note
To prevent secrets from showing up in traces, use obfuscated string literals.
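For example, prefixing the string literal with h keeps the embedded SAS token out of traces; the account, file, and token below are placeholders.

```kusto
// The h prefix marks the connection string as an obfuscated string literal,
// so the embedded SAS token isn't written to traces. All values are placeholders.
externaldata (Line: string)
[
    h@"https://fabrikam.blob.core.windows.net/container/file.csv?sv=...&sp=r&sig=PLACEHOLDER"
]
with (format = "txt")
```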
To interact with nonpublic external storage, you must specify an authentication method as part of the external storage connection string. The connection string defines the resource to access and its authentication information.
The following authentication methods are supported:
- Impersonation
- Managed identity
- Shared Access (SAS) key
- Microsoft Entra access token
- Storage account access key
- Amazon Web Services Programmatic Access Keys
- Amazon Web Services S3 presigned URL
The following table summarizes the available authentication methods for different external storage types.
Authentication method | Available in Blob Storage? | Available in Azure Data Lake Storage Gen2? | Available in Azure Data Lake Storage Gen1? | Available in Amazon S3? | When should you use this method? |
---|---|---|---|---|---|
Impersonation | ✔️ | ✔️ | ✔️ | ❌ | Use for attended flows when you need complex access control over the external storage, for example in continuous export flows. You can also restrict storage access at the user level. |
Managed identity | ✔️ | ✔️ | ✔️ | ❌ | Use in unattended flows, where no Microsoft Entra principal can be derived to execute queries and commands. In such flows, managed identities are the only authentication solution. |
Shared Access (SAS) key | ✔️ | ✔️ | ❌ | ❌ | SAS tokens have an expiration time. Use when accessing storage for a limited time. |
Microsoft Entra access token | ✔️ | ✔️ | ✔️ | ❌ | Microsoft Entra tokens have an expiration time. Use when accessing storage for a limited time. |
Storage account access key | ✔️ | ✔️ | ❌ | ❌ | Use when you need to access resources on an ongoing basis. |
Amazon Web Services Programmatic Access Keys | ❌ | ❌ | ❌ | ✔️ | Use when you need to access Amazon S3 resources on an ongoing basis. |
Amazon Web Services S3 presigned URL | ❌ | ❌ | ❌ | ✔️ | Use when you need to access Amazon S3 resources with a temporary presigned URL. |
Impersonation
Kusto impersonates the requestor's principal identity to access the resource. To use impersonation, append ;impersonate to the connection string.
Example |
---|
"https://fabrikam.blob.core.windows.net/container/path/to/file.csv;impersonate" |
The principal must have the necessary permissions to perform the operation. For example, in Azure Blob Storage, to read from a blob the principal needs the Storage Blob Data Reader role, and to export to a blob the principal needs the Storage Blob Data Contributor role. To learn more, see Azure Blob Storage / Data Lake Storage Gen2 access control or Data Lake Storage Gen1 access control.
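For example, a minimal sketch of an export command that writes with the caller's own identity; the storage container and source table are assumptions, and the caller would need the Storage Blob Data Contributor role on the container.

```kusto
// Export query results to a hypothetical Blob Storage container using impersonation.
// The caller's principal needs Storage Blob Data Contributor on the container.
.export to csv (
    h@"https://fabrikam.blob.core.windows.net/exports;impersonate"
) <|
SalesTable
| where OrderDate > ago(7d)
```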
Managed identity
Azure Data Explorer makes requests on behalf of a managed identity and uses its identity to access resources. For a system-assigned managed identity, append ;managed_identity=system to the connection string. For a user-assigned managed identity, append ;managed_identity={object_id} to the connection string.
Managed identity type | Example |
---|---|
System-assigned | "https://fabrikam.blob.core.windows.net/container/path/to/file.csv;managed_identity=system" |
User-assigned | "https://fabrikam.blob.core.windows.net/container/path/to/file.csv;managed_identity=12345678-1234-1234-1234-1234567890ab" |
The managed identity must have the necessary permissions to perform the operation. For example, in Azure Blob Storage, to read from a blob the managed identity needs the Storage Blob Data Reader role, and to export to a blob the managed identity needs the Storage Blob Data Contributor role. To learn more, see Azure Blob Storage / Data Lake Storage Gen2 access control or Data Lake Storage Gen1 access control.
Note
Managed identity is only supported in specific Azure Data Explorer flows and requires setting up the managed identity policy. For more information, see Managed identities overview.
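For example, an unattended export using the system-assigned managed identity might look like the following sketch; the storage container and source table are assumptions, and the managed identity policy must already permit this usage.

```kusto
// Export with the cluster's system-assigned managed identity instead of the caller's identity.
// Requires the managed identity policy to allow this usage and the identity to hold
// Storage Blob Data Contributor on the target container.
.export async to parquet (
    h@"https://fabrikam.blob.core.windows.net/exports;managed_identity=system"
) <|
TelemetryTable
```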
Shared Access (SAS) key
1. In the Azure portal, generate a SAS token with the required permissions. For example, to read from the external storage, specify the Read and List permissions; to export to the external storage, specify the Write permission. To learn more, see delegate access by using a shared access signature.
2. Use the SAS URL as the connection string.
Example |
---|
"https://fabrikam.blob.core.windows.net/container/path/to/file.csv?sv=...&sp=rwd" |
Microsoft Entra access token
To add a base-64 encoded Microsoft Entra access token, append ;token={AadToken} to the connection string. The token must be for the resource https://storage.azure.com/.
For more information on how to generate a Microsoft Entra access token, see get an access token for authorization.
Example |
---|
"https://fabrikam.blob.core.windows.net/container/path/to/file.csv;token=1234567890abcdef1234567890abcdef1234567890abc..." |
Storage account access key
To add a storage account access key, append the key to the connection string. In Azure Blob Storage, append ;{key} to the connection string. For Azure Data Lake Storage Gen2, append ;sharedkey={key} to the connection string.
Storage account | Example |
---|---|
Azure Blob Storage | "https://fabrikam.blob.core.windows.net/container/path/to/file.csv;ljkAkl...==" |
Azure Data Lake Storage Gen2 | "abfss://fs@fabrikam.dfs.core.windows.net/path/to/file.csv;sharedkey=sv=...&sp=rwd" |
Amazon Web Services Programmatic Access Keys
To add Amazon Web Services access keys, append ;AwsCredentials={ACCESS_KEY_ID},{SECRET_ACCESS_KEY} to the connection string.
Example |
---|
"https://yourbucketname.s3.us-east-1.amazonaws.com/path/to/file.csv;AwsCredentials=AWS1234567890EXAMPLE,1234567890abc/1234567/12345678EXAMPLEKEY" |
Amazon Web Services S3 presigned URL
Use the S3 presigned URL as the connection string.
Example |
---|
"https://yourbucketname.s3.us-east-1.amazonaws.com/file.csv?12345678PRESIGNEDTOKEN" |