Column truncate issue - Copy file to DB ADF pipeline

ADF_Coder 0 Reputation points
2024-08-28T17:21:23.57+00:00

Hi All,

I have a pipeline where I am copying data from a file to a database. In the file, I have a few columns with extra spaces that I need to trim or truncate. I am not getting any option to use any transformation or function (dynamic content) on the destination side.

I have to do this in the ADF copy activity only.

Azure SQL Database
Azure Data Factory

1 answer

  1. AnnuKumari-MSFT 32,816 Reputation points Microsoft Employee
    2024-08-29T07:58:52.1533333+00:00

    Hi ADF_Coder,

    Thank you for using the Microsoft Q&A platform, and thanks for posting your query here. As I understand it, you are facing an issue loading data from source to destination when trying to trim the data along the way.

    I would appreciate it if you could share more details, such as your source and target data stores. From the error message, it looks like you are using a Snowflake DB. Is that your target database? What is the source?

    To trim trailing characters from a string, you can use the trim() function in an ADF pipeline expression, or the rtrim() function in a mapping data flow.
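    As a rough sketch of those two options (the variable and column names below are hypothetical placeholders, not from your pipeline):

    ```
    // ADF pipeline dynamic content (hypothetical variable name):
    @trim(variables('customerName'))

    // Mapping data flow derived-column expression (hypothetical column name):
    rtrim(CustomerName)
    ```

    Note that the copy activity itself does not support per-column expressions, which is why the trim has to happen either in a pipeline expression before the copy or in a mapping data flow transformation.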

    However, the error message seems more related to the Snowflake connector. To resolve it, kindly try setting the option strip_outer_array = true on the file format and attempt the load again. The considerations for loading large semi-structured data are documented in this article: https://docs.snowflake.com/en/user-guide/semistructured-considerations.html
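    If Snowflake is indeed the target, that option is set on a Snowflake JSON file format. A minimal sketch, assuming a hypothetical file format name:

    ```sql
    -- Hypothetical file format name. STRIP_OUTER_ARRAY removes the outer
    -- JSON array wrapper so each array element loads as a separate row.
    CREATE OR REPLACE FILE FORMAT my_json_format
      TYPE = JSON
      STRIP_OUTER_ARRAY = TRUE;
    ```

    You would then reference this file format when loading the staged files into the target table.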

    Hope this helps. Kindly let us know how it goes. Thank you.

