Pipeline status = success but no data was written to the sink

Mick Jaeger 0 Reputation points
2024-09-05T23:37:55.33+00:00

Pipeline Run ID: 844b967c-aeb1-4dd7-9ba2-14b7a34d85cc

Hi,

I am trying to debug my ADF Pipeline and cannot see useful details. Can you please help me debug this pipeline?

Specifically, I have 4 objects in the pipeline that are looped through. Three of them are working as expected, but the 4th is not. The issue is with the 'CopyData OrderPayments' object.

In the run details, it's impossible to determine which loop iteration corresponds to which source/sink.

Can you please help with this?

Thank you,
Mick

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. Vinodh247 22,871 Reputation points
    2024-09-06T06:55:45.64+00:00

    To debug this issue in Azure Data Factory, especially when dealing with loops, you can try the following steps to isolate the problem:

    • Use variables: If the loop iterates over a list, add a Set Variable activity inside the loop, just before each Copy Data activity, that records the name or identifier of the item currently being processed (see the sketch just after this list). That marker shows up in the run history and lets you trace which iteration corresponds to which source/sink.
    • Go to the Monitor tab in ADF, find your pipeline run, and drill down into the 'CopyData OrderPayments' activity. Look for any warnings or errors in the details, and check whether rows were read but not written (an abbreviated example of the run output is shown below). If you run the pipeline in debug mode, you can also preview the data being copied in the Copy Data activity to confirm it is pulling the expected data.
    • Ensure that the source-to-sink mapping is correctly configured for the 'CopyData OrderPayments' activity. Issues with mapping, especially in complex pipelines with dynamic schemas, can prevent data from being written to the sink (a sketch of an explicit column mapping is shown below as well). Also validate that the sink destination (e.g., a storage account or database) is correctly set up and accessible.
    • When looking at the pipeline monitoring logs, under the 'Activity Runs' tab you'll see the individual iterations. Click into each iteration's Copy Data activity to see its input and output details; the Input shows the source and sink with all parameters resolved, which tells you exactly which object that iteration handled.
    • Ensure there are no throttling issues or timeouts that might have prevented the data from being written to the sink. Also check that the source actually contains the expected data (e.g., the OrderPayments table), as it's possible the source for that iteration is simply empty.
    • Review any conditional filters applied to the sink operation that may block data writes, such as constraints, empty data filtering, or triggers based on data content.
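
    For the first bullet, here is a minimal sketch of a ForEach that records the current item before running the copy. The parameter name ObjectList, the variable name CurrentItem, and the activity names are only placeholders, the Copy activity's source/sink settings are omitted for brevity, and CurrentItem must be declared as a String variable on the pipeline:

    ```json
    {
        "name": "ForEach_CopyObjects",
        "type": "ForEach",
        "typeProperties": {
            "items": { "value": "@pipeline().parameters.ObjectList", "type": "Expression" },
            "isSequential": true,
            "activities": [
                {
                    "name": "Set CurrentItem",
                    "type": "SetVariable",
                    "typeProperties": {
                        "variableName": "CurrentItem",
                        "value": { "value": "@string(item())", "type": "Expression" }
                    }
                },
                {
                    "name": "CopyData OrderPayments",
                    "type": "Copy",
                    "dependsOn": [
                        { "activity": "Set CurrentItem", "dependencyConditions": [ "Succeeded" ] }
                    ]
                }
            ]
        }
    }
    ```

    Keeping isSequential set to true while debugging is deliberate: pipeline variables are shared across parallel iterations, so with parallel execution the Set Variable values can interleave and become harder to attribute.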
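
    For the second and fourth bullets: when you open a particular iteration's 'CopyData OrderPayments' run in the Monitor view, its Input shows the resolved source and sink, and its Output carries per-run counters. The values below are illustrative and the exact fields vary by connector, but the Output JSON looks roughly like this:

    ```json
    {
        "dataRead": 1048576,
        "dataWritten": 0,
        "rowsRead": 5000,
        "rowsCopied": 0,
        "copyDuration": 12,
        "errors": [],
        "effectiveIntegrationRuntime": "AutoResolveIntegrationRuntime",
        "usedDataIntegrationUnits": 4,
        "usedParallelCopies": 1
    }
    ```

    If rowsRead is greater than zero but rowsCopied and dataWritten are zero, something between source and sink is dropping rows (for example, fault-tolerance settings skipping incompatible rows; check the errors array and, if present, a rowsSkipped counter). If rowsRead is zero, the source for that iteration was simply empty, which matches the empty-source scenario above.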
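
    For the mapping bullet, an explicit column mapping on the Copy activity looks like the fragment below, which sits under the activity's typeProperties. The column names here are placeholders for whatever the OrderPayments source and sink actually contain:

    ```json
    {
        "translator": {
            "type": "TabularTranslator",
            "mappings": [
                { "source": { "name": "OrderId" },     "sink": { "name": "OrderId" } },
                { "source": { "name": "PaymentDate" }, "sink": { "name": "PaymentDate" } },
                { "source": { "name": "Amount" },      "sink": { "name": "Amount" } }
            ]
        }
    }
    ```

    If the mapping is built dynamically per iteration (for example, from a metadata table), confirm that the expression resolves correctly for the OrderPayments item, since that is one place where this iteration can differ from the three that work.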

    Please also share any error logs or the input/output of the affected activity run; that will help narrow down the issue further.

