Data Factory copy activity writes a huge file from a tiny file: throttling error?
Hello, A copy activity has written very large files (over 1TB) when the files should have been tiny (less than 1MB). Here are the details: the copy activity has an SFTP source, binary, and an ADLS Gen 2 sink, binary as well. The files have a…
Copy Activity writing an empty value from REST API to SQL DW when preview is correct
Hi, I have a copy activity to copy a value from a REST API to a SQL DW. I configured the source and I can even get a correct preview of the value and correct mapping (mapped the name of the value I wanted to the correct sink on the DW by using the…
Can we trigger a pipeline on the first business day of each month, excluding holidays?
Hi, Can we exclude public holidays when triggering a Data Factory pipeline? If not, can we achieve this in Logic Apps, so that I can call the pipeline from a Logic App? Also, I want the trigger to account for daylight saving time instead of UTC. Thanks in advance.…
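The holiday-exclusion part of this question can be handled outside the trigger itself, e.g. in a Logic App or Azure Function step that decides whether today is the first business day. A minimal sketch in Python, assuming the holiday calendar is supplied as a set of dates (the `holidays` set here is a made-up example):

```python
from datetime import date, timedelta

def first_business_day(year, month, holidays):
    """Return the first weekday of the month that is not in `holidays`."""
    d = date(year, month, 1)
    while d.weekday() >= 5 or d in holidays:  # 5 = Saturday, 6 = Sunday
        d += timedelta(days=1)
    return d

# Example: 1 Jan 2021 is a Friday but a public holiday, so the first
# business day falls on Monday 4 Jan.
holidays = {date(2021, 1, 1)}
print(first_business_day(2021, 1, holidays))  # → 2021-01-04
```

The pipeline can then run daily and simply exit early (or the Logic App can skip the call) whenever today does not equal the computed first business day.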
Loading Dynamics CDM with Azure Data Factory (@entityReference)
Using Azure Data Factory to Sink to Dynamics Common Data Model. Source: Reading from MySQL Col1: dpp_authorityid Col2: dpp_acitivitygroup Destination - CDM - dpp_authority Col1: dpp_authorityid Col2: dpp_acitivitygroup (this is Lookup…
How to deal with long running Azure Functions in Azure Data Factory
I have a Function (HttpTrigger) that can take a while to run (5-10 minutes). It reads some information from the body of the request to perform some actions: [FunctionName("myFunc")] public static async Task<IActionResult> Run( …
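A common workaround for HTTP-triggered functions that outrun the caller's timeout is to split the work into a "start" call and a "status" endpoint that the pipeline polls (for example with an Until loop). A language-agnostic sketch of the polling side in Python, where `get_status` stands in for the hypothetical status HTTP call:

```python
import time

def poll_until_done(get_status, interval_s=30, timeout_s=900):
    """Poll a status callable until it reports a terminal state or we time out.
    `get_status` stands in for an HTTP GET against a /status endpoint."""
    waited = 0
    while waited < timeout_s:
        status = get_status()
        if status in ("Succeeded", "Failed"):
            return status
        time.sleep(interval_s)
        waited += interval_s
    raise TimeoutError("function did not finish in time")

# Usage with a fake status source standing in for the HTTP endpoint:
states = iter(["Running", "Running", "Succeeded"])
print(poll_until_done(lambda: next(states), interval_s=0))  # → Succeeded
```

Durable Functions offer the same start/poll contract out of the box, which is why they are often suggested for this scenario.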
ADF and data sources of Azure Cloud and On-Prem
All, We need to bring data into the Azure cloud from external sources that are not in our on-prem infrastructure. For any data sources inside on-prem we would use an Integration Runtime. But considering that it is from a 3rd-party vendor which proved is…
Multi Tenant and Event Grid trigger.
Respected, We have a website which supports multiple tenants. We are designing an Azure Data Factory solution wherein each tenant will upload a file to Azure Blob Storage that needs to be processed; once the file is uploaded, the trigger configured…
Loading files with ever changing columns
Hi All, I am using ADF to load files from an SFTP site into ADLS. The issue is that the incoming files are not consistent column-wise. Sometimes they add columns, and there is no way we get notified. Is there a way in ADF where it automatically copies the…
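Outside of ADF's built-in schema drift handling, one defensive approach is to map incoming rows by column name against a fixed target schema, dropping unknown columns and filling missing ones. A sketch in Python, where `TARGET_COLUMNS` and the sample file content are assumptions:

```python
import csv
import io

TARGET_COLUMNS = ["id", "name", "amount"]  # columns the sink expects (assumed)

def normalize_row(row_dict):
    """Keep only known columns; fill missing ones with None so extra or
    reordered source columns don't break the load."""
    return {col: row_dict.get(col) for col in TARGET_COLUMNS}

# A file that grew an unannounced extra column:
incoming = "id,name,amount,new_col\n1,widget,9.99,surprise\n"
reader = csv.DictReader(io.StringIO(incoming))
rows = [normalize_row(r) for r in reader]
print(rows)  # → [{'id': '1', 'name': 'widget', 'amount': '9.99'}]
```

The same mapping-by-name idea is what ADF mapping data flows do when schema drift is enabled; this sketch is only for prototyping the rule outside the service.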
JSON object to array of nested JSON
I have a JSON object in Cosmos DB (source) which I need to transform into an array of nested JSON objects (similar to the source objects) in Cosmos DB (sink). Using the derived column, I am facing an error during the typecast of complex JSON to array. Could I get an…
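The shape of the transformation (single object in, array of nested objects out) can be prototyped outside ADF before wiring it into a derived column. A Python sketch, with `doc` as a made-up source document:

```python
import json

# Hypothetical source document from Cosmos DB:
doc = {"id": "1", "details": {"a": 1, "b": 2}}

# Wrap each key/value pair of the nested object into its own element
# of an array, preserving the parent id on the result.
transformed = {
    "id": doc["id"],
    "details": [{"key": k, "value": v} for k, v in doc["details"].items()],
}
print(json.dumps(transformed))
```

Having the target shape written down this concretely usually makes it easier to express the equivalent derived-column expression, or to fall back to doing the reshaping in an Azure Function if the data flow typecast keeps failing.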
Data Factory pipeline shows success when copying a CSV to Azure SQL but data does not appear in the table
When running a Data Factory pipeline to copy a CSV file to Azure SQL, the job shows as a success and confirms that the proper number of rows were copied, but when querying the table in the database, the data does not show.
Azure Data Factory - Paypal - Lookup error
I am trying to execute a lookup to read records from Paypal: This is the query I am running Select Id,Transactions_Index,Sale_Create_Time,Sale_Parent_Payment,Sale_Amount_Details_Subtotal, Sale_Id from Payment_Transactions_Sale Where Sale_Create_Time…
Azure data factory - query pipeline by data annotations
Is there a way to programmatically query (using the .NET SDK) the list of data factory pipelines by data annotations? I can set a data annotation when I run a pipeline as explained here, but I'm not sure how to query or filter pipelines using the same.…
How can we get uptime for all the resources in Azure?
Hi guys, I have to check the uptime for the resources in Azure. For some resources, we can check directly from the Azure portal: I'm able to see uptime for the storage account and SQL database, but how can I get the uptime for other resources such…
Expiry Datetime incorrect for Tumbling Window Trigger Catchup Run?
I have a pipeline that primarily has 2 copy activities. The sink for the copy activities is ADLS Gen 1, so it also sets an expiry datetime for each of the files copied over. It uses a tumbling window trigger with a self-dependency. Recently, the…
Logs are missing in the Log Analytics Workspace - Azure Data Factory starting 10/23/2020
Hello, Our ADF and its Log Analytics workspace are both hosted in West US 2, and since 10/23 we've noticed several pipeline runs do not come with the Succeeded or Failed log, only InProgress or Queued. It threw off our query since…
Data Factory loading multiple CSV files. Is there any way of dealing with column header differences?
I have CSVs in a data lake. I'm copying them into SQL using the copy activity; the file path type is a wildcard file path. Very occasionally there are slight differences between the column headers. Is there any way that this can be dealt with rather…
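One way to catch the mismatched files before the copy activity runs is to compare each file's header row against the expected column list and route the odd ones aside instead of failing the whole load. A sketch in Python, where `EXPECTED` and the sample headers are assumptions:

```python
import csv
import io

EXPECTED = ["order_id", "customer", "total"]  # assumed target column list

def header_diff(csv_text):
    """Report columns missing from or extra in a file's header, so
    mismatched files can be routed aside instead of failing the copy."""
    header = next(csv.reader(io.StringIO(csv_text)))
    return {
        "missing": [c for c in EXPECTED if c not in header],
        "extra": [c for c in header if c not in EXPECTED],
    }

print(header_diff("order_id,customer,total,discount\n1,acme,10,0\n"))
# → {'missing': [], 'extra': ['discount']}
```

Run as a pre-flight step (for example in an Azure Function called by the pipeline), this lets the wildcard copy proceed only over files whose headers line up.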
Getting TlsVersionNotPermitted error when SSIS package executing from ADF
Hi, I have created one SSIS package which reads data from a storage account and loads it into an Azure DB. This works perfectly well when executed from Visual Studio, but when I trigger it after deployment in ADF, I get the below error.…
Query Azure Function HTTP Get from Azure Data Factory
I have a function like: public static async Task<IActionResult> Run( [HttpTrigger(AuthorizationLevel.Function, "get", Route = null)] HttpRequest req, ILogger log) { string dateFromString =…
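On the function side, query-string parameters arrive as strings, so a date passed from ADF needs explicit parsing with a sensible fallback. A Python sketch of that parsing step (the `dateFrom` parameter name is an assumption, inferred from the snippet's `dateFromString`):

```python
from datetime import date, datetime

def parse_date_param(params, name, default=None):
    """Parse an ISO date (YYYY-MM-DD) from a query-string dict,
    falling back to a default when the parameter is absent or empty."""
    raw = params.get(name)
    if not raw:
        return default
    return datetime.strptime(raw, "%Y-%m-%d").date()

print(parse_date_param({"dateFrom": "2020-10-01"}, "dateFrom"))      # → 2020-10-01
print(parse_date_param({}, "dateFrom", default=date(2020, 1, 1)))    # → 2020-01-01
```

From the ADF Web or Azure Function activity, the value would be appended to the URL as a query string built with pipeline expressions; the parsing above is the mirror image of that on the receiving end.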
Azure data factory web activity body
How do I use parameters in the body of the Push method in Data Factory? It seems to allow static values or a parameter, but not both. Say I want my message body to be {"count" :@Anonymous }, but this isn't working. Any ideas?
Is it possible to rename a file with MD5 hash encoding in Azure Data Factory?
I have this scenario where I have a blob with multiple binary files, and I would like to copy these files into another blob storage using Azure Data Factory. The issue is that my destination file name should be MD5-encoded. Now I would like to know if it…
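As far as I know the copy activity's sink settings don't expose an MD5-rename option directly, so this is usually done in a pre- or post-processing step (e.g. an Azure Function) that hashes the file bytes and names the destination accordingly. The hashing part itself is straightforward; a Python sketch:

```python
import hashlib

def md5_name(content: bytes, extension: str = "") -> str:
    """Derive a destination file name from the MD5 digest of the file's bytes."""
    return hashlib.md5(content).hexdigest() + extension

print(md5_name(b"hello world", ".bin"))
# → 5eb63bbbe01eeed093cb22bb8f5acdc3.bin
```

Note this hashes the file's *content*; if the requirement is to hash the original file *name* instead, pass the name's bytes rather than the blob payload.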