Dynamic Mapping of JSON Sink Data in ADF

Vinit Joshi 5 Reputation points
2024-09-06T12:06:58.31+00:00

Hello Microsoft Community,

I hope you're all doing well. I'm currently working on a project in Azure Data Factory and need some advice!
We can successfully preview source data from the Salesforce linked service and write data through the REST linked service for Business Central in Azure Data Factory (ADF). However, we are running into an issue with schema import, since the sink data is in JSON format.

Could anyone suggest an approach for dynamically mapping the source data to the sink using the 'Add Dynamic Content' feature in ADF?

Thanks in advance!

Vinit.


1 answer

  1. Amira Bedhiafi 23,251 Reputation points
    2024-09-06T13:12:41.8466667+00:00

    Use a Copy Activity to transfer data from Salesforce (or any source) to Business Central via REST API.

    While configuring the sink side, you'll need to map the fields dynamically using expressions under the Mapping tab.

    In the Copy Activity, open the Mapping tab. This is where you map source fields to the sink fields (JSON paths, in your case).

    Instead of hardcoding the mappings, select the Add Dynamic Content feature to create flexible mappings based on runtime variables or parameters.

    Define parameters in your pipeline for source column names and target JSON field names.

    Use these parameters inside your mappings by referring to them in the Add Dynamic Content window.
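    For instance, the pipeline parameters could be declared like this in the pipeline's authoring JSON (the parameter names and default values here are illustrative, not taken from your scenario):

    ```json
    {
        "parameters": {
            "sourceColumn": {
                "type": "String",
                "defaultValue": "Name"
            },
            "targetField": {
                "type": "String",
                "defaultValue": "displayName"
            }
        }
    }
    ```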

    Example:

    @pipeline().parameters.targetField
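    Putting this together, the whole mapping for a JSON sink can be supplied dynamically as a TabularTranslator object, where each sink entry uses a JSON path. A sketch (the field names are placeholders for your actual Salesforce columns and Business Central fields):

    ```json
    {
        "type": "TabularTranslator",
        "mappings": [
            {
                "source": { "name": "Name" },
                "sink": { "path": "$.displayName" }
            },
            {
                "source": { "name": "Phone" },
                "sink": { "path": "$.phoneNumber" }
            }
        ]
    }
    ```

    You can store this object in a pipeline parameter and reference it in the Mapping tab's Add Dynamic Content window with an expression such as `@json(pipeline().parameters.mapping)`.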

    Since your sink is in JSON format, you can use Data Flow instead of direct copy to handle more complex transformations and schema alignment.

    Within the Data Flow, you can use the Derived Column transformation to restructure or rename the columns as per the target's JSON structure.

    You can also use the Flatten transformation if your source contains hierarchical data (nested arrays) that needs to be unrolled into individual rows before it is written in the JSON shape Business Central expects.
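    As a rough illustration, the underlying data flow script for such a flow might look like the following. The stream names, column names, and nested structure are all hypothetical; this is only a sketch of how a Derived Column and a Flatten step chain together:

    ```
    source(output(
            FirstName as string,
            LastName as string,
            Orders as (OrderId as string)[]
        ),
        allowSchemaDrift: true) ~> SalesforceSource
    SalesforceSource derive(displayName = concat(FirstName, ' ', LastName)) ~> RenameFields
    RenameFields foldDown(unroll(Orders),
        mapColumn(
            displayName,
            orderId = Orders.OrderId
        )) ~> FlattenOrders
    ```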

    Make sure your REST linked service in ADF is correctly configured for POST or PUT methods, depending on the API requirements of Business Central.

    If you need to pass the JSON dynamically, you can use Web Activity in combination with a Copy Activity, or you can configure your REST API dataset to accept dynamic content (via expressions) for both headers and body.
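    For example, a Web Activity posting a dynamically built JSON body could be authored along these lines. The URL segments in angle brackets are placeholders you'd replace with your own tenant and environment, and the body field names are hypothetical:

    ```json
    {
        "name": "PostCustomerToBC",
        "type": "WebActivity",
        "typeProperties": {
            "method": "POST",
            "url": "https://api.businesscentral.dynamics.com/v2.0/<tenant>/<environment>/api/v2.0/customers",
            "headers": {
                "Content-Type": "application/json"
            },
            "body": {
                "value": "@concat('{\"displayName\": \"', pipeline().parameters.targetField, '\"}')",
                "type": "Expression"
            }
        }
    }
    ```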

