Azure Synapse Link - Not creating tables in Synapse Analytics

Daniel Pearson 20 Reputation points
2024-08-05T11:05:23.24+00:00

Hi,

I am changing from Export to Data Lake via Dynamics to Azure Synapse Link, as recommended by Microsoft, since Export to Data Lake is being retired this year. I can successfully select the FnO tables through Azure Synapse Link, which creates the data in CSV format in my storage account.

This also created a serverless Lake Database in my Synapse Analytics workspace; however, the tables from the CSVs are not being created in this database. All the documentation and videos I have seen on this suggest that this is an automatic process. I also have 'Connect to your Azure Synapse Analytics workspace' ticked, but not the Spark pool option, as I didn't think I needed it.

Can someone help me get the tables created in the Lake Database?

Azure Synapse Analytics
An Azure analytics service that brings together data integration, enterprise data warehousing, and big data analytics. Previously known as Azure SQL Data Warehouse.

Accepted answer
  1. PRADEEPCHEEKATLA-MSFT 88,716 Reputation points Microsoft Employee
    2024-09-03T09:55:33.33+00:00

    @Daniel Pearson - I'm glad that you were able to resolve your issue and thank you for posting your solution so that others experiencing the same thing can easily reference this! Since the Microsoft Q&A community has a policy that "The question author cannot accept their own answer. They can only accept answers by others", I'll repost your solution in case you'd like to accept the answer.

    Ask: Azure Synapse Link - Not creating tables in Synapse Analytics

    Solution: The issue is resolved. We've investigated this further and discovered that it is actually enabling the Spark pool option when setting up Azure Synapse Link that auto-creates the tables in the Lake Database.

    If I missed anything please let me know and I'd be happy to add it to my answer, or feel free to comment below with any additional information.

    If you have any other questions, please let me know. Thank you again for your time and patience throughout this issue.


    Please don’t forget to Accept Answer and Yes for "was this answer helpful" wherever the information provided helps you, as this can be beneficial to other community members.

    1 person found this answer helpful.

2 additional answers

  1. Amira Bedhiafi 23,096 Reputation points
    2024-08-05T16:02:15.6266667+00:00

    Can you verify that the "Connect to your Azure Synapse Analytics workspace" option is enabled?

    Double-check that you have the appropriate permissions for the storage account and the Synapse Analytics workspace.

    Next, verify that the CSV files have been created correctly in your storage account.

    If the tables are not being automatically created, you can manually create external tables in the serverless SQL pool:

    CREATE EXTERNAL TABLE [dbo].[YourTableName] (
        Column1 DataType,
        Column2 DataType,
        ...
    )
    WITH (
        LOCATION = 'path/to/your/csv/file.csv',
        DATA_SOURCE = YourDataSourceName,
        FILE_FORMAT = YourFileFormatName
    );
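
    For the snippet above to work, the external data source and file format it references must already exist in the serverless database. A minimal sketch of creating them, assuming a placeholder storage URL and the same placeholder names used above, could look like this:

    -- Placeholder URL: replace with your own storage account and container.
    CREATE EXTERNAL DATA SOURCE YourDataSourceName
    WITH (
        LOCATION = 'https://yourstorageaccount.dfs.core.windows.net/yourcontainer'
    );

    -- CSV file format; adjust the options if your files use different delimiters
    -- or do not contain a header row.
    CREATE EXTERNAL FILE FORMAT YourFileFormatName
    WITH (
        FORMAT_TYPE = DELIMITEDTEXT,
        FORMAT_OPTIONS (
            FIELD_TERMINATOR = ',',
            STRING_DELIMITER = '"',
            FIRST_ROW = 2  -- skip the header row, if one is present
        )
    );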
    
    

  2. PRADEEPCHEEKATLA-MSFT 88,716 Reputation points Microsoft Employee
    2024-08-08T10:38:35.8+00:00

    @Daniel Pearson - Thanks for the question and using MS Q&A platform.

    It sounds like you are having trouble creating tables in the serverless Lake Database in your Synapse Analytics workspace using Azure Synapse Link. Let me see if I can help you with that.

    Firstly, it is correct that when you use Azure Synapse Link to export your Dynamics tables to Azure Data Lake Storage, it should automatically create a serverless Lake Database in your Synapse Analytics workspace. However, the tables from the CSV files are not automatically created in this database. You will need to create external tables in the serverless SQL pool to access the data in the CSV files.

    To create external tables, you can use the T-SQL CREATE EXTERNAL TABLE command. Here is an example:

    CREATE EXTERNAL TABLE [dbo].[MyTable]
    (
        [Column1] [nvarchar](50) NOT NULL,
        [Column2] [nvarchar](50) NOT NULL,
        [Column3] [nvarchar](50) NOT NULL
    )
    WITH
    (
        LOCATION='/MyFolder/MyFile.csv',
        DATA_SOURCE = MyDataSource,
        FILE_FORMAT = MyFileFormat,
        REJECT_TYPE = VALUE,
        REJECT_VALUE = 0
    );
    
    

    In this example, MyTable is the name of the external table you want to create, and Column1, Column2, and Column3 are the columns in the CSV file. You will need to replace MyFolder/MyFile.csv with the location of your CSV file in your Azure Data Lake Storage, and MyDataSource and MyFileFormat with the names of your data source and file format.
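
    Before creating external tables, it can also be useful to confirm that the serverless SQL pool can read the CSV files at all. An ad-hoc OPENROWSET query works for this; the storage URL and folder below are placeholders for your own account, container, and table folder:

    -- Placeholder path: point this at the folder the CSV files are exported to.
    SELECT TOP 10 *
    FROM OPENROWSET(
        BULK 'https://yourstorageaccount.dfs.core.windows.net/yourcontainer/MyFolder/*.csv',
        FORMAT = 'CSV',
        PARSER_VERSION = '2.0'
    ) AS rows;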

    Follow-up question: What are the appropriate permissions for the storage account and the Synapse Analytics workspace? I'm not hugely familiar with the role assignments. Can you give me some suggestions on how to add them?

    The Storage Blob Data Contributor role is a built-in Azure role that grants read, write, and delete access to blob containers and data. The Synapse Analytics workspace needs this role to access data in Azure Data Lake Storage Gen2 (ADLS Gen2).

    When you assign the Storage Blob Data Contributor role to a user or service principal at the container or storage-account level, they are granted read, write, and delete access to all of the blobs in that container or storage account.

    In summary, the Storage Blob Data Contributor role is what allows the Synapse Analytics workspace to access data in ADLS Gen2, and it grants the minimum permission required to read, write, and delete containers and blobs.

    Why do we need a Storage Blob Data Contributor role for Azure Synapse users?

    Note: Storage Blob Data Contributor: Use to grant read/write/delete permissions to Blob storage resources.

    If you do not assign the "Storage Blob Data Contributor" role to other Synapse users, they will not be able to access the data in ADLS Gen2 because they lack permission on the storage account.

    If they try to access data in ADLS Gen2 without the "Storage Blob Data Contributor" role on the storage account, they will receive the following error message:

    Error: org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: Operation failed: "This request is not authorized to perform this operation using this permission", 403, HEAD, https://storage_account.dfs.core.windows.net/synapse/tmp/hive?upn=false&action=getStatus&timeout=90
    

    For more details, refer to How to set up access control for your Azure Synapse workspace
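
    If you prefer to script the role assignment rather than use the Azure portal, the Azure CLI can add it. This is only a sketch; the user, subscription, resource group, and storage account names below are placeholders:

    # Placeholder values: substitute your own user/principal and storage account scope.
    az role assignment create \
        --assignee "user@contoso.com" \
        --role "Storage Blob Data Contributor" \
        --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"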

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, do click Accept Answer and Yes for "was this answer helpful". And if you have any further queries, do let us know.

