Suggest a way to replicate data from on-premises SQL Server 2019 to Azure Synapse Analytics directly
I'm using an on-premises SQL Server 2019 instance, and I need to transfer or replicate real-time data to an Azure Synapse database. Which Azure service would be the most effective and flexible for large data transfers?
I have a Z that appears at the end of my date field
Hello, I'm trying to retrieve all the data that was modified yesterday from my ERP via an API that is accessible in OData. However, I'm having an issue with my date field, as the character Z appears at the end of the value. I tried converting it…
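The trailing Z is not stray data: it is the ISO 8601 designator for UTC that OData timestamps commonly carry. A minimal Python sketch (the sample value is hypothetical) showing how to parse such a timestamp and reformat it without the Z:

```python
from datetime import datetime, timezone

# The trailing "Z" in an OData timestamp is the ISO 8601 UTC designator.
# Mapping "Z" to "+00:00" makes the value parseable on any Python 3.7+;
# Python 3.11+ also accepts the "Z" directly in fromisoformat.
raw = "2024-05-01T13:45:30Z"  # hypothetical value from the API
parsed = datetime.fromisoformat(raw.replace("Z", "+00:00"))

print(parsed.tzinfo == timezone.utc)            # True: the value is explicitly UTC
print(parsed.strftime("%Y-%m-%d %H:%M:%S"))     # formatted without the "Z"
```

Rather than stripping the Z as text, parsing it this way preserves the timezone information, which matters when the sink stores local times.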
Error "Cannot read properties of undefined (reading '0')"
I'm using Azure Data Factory online via a browser. I'm fairly new to ADF and have yet to complete any ADF tutorial due to various problems I've encountered. I'm doing a tutorial on ADF from Udemy. I'm running a pipeline to copy data from a…
Error 21352 accessing .xlsx file from blob storage in ADF
Hi there, I have a copy activity in ADF that is set to use a .xlsx file in our blob storage as its source. When connecting to the .xlsx file, I can see it in ADF just fine. But when trying to select a sheet within the .xlsx file, it gives me an…
Azure Data Factory, Trigger is not working for weekly and monthly schedule but works for Daily Schedule
Create a test Azure pipeline and add a trigger to it with a monthly or weekly frequency. When the scheduled time arrives, the trigger does not start the pipeline. After manually shifting the trigger's time, the trigger starts working.
Switch from Legacy BigQuery to new BigQuery Linked Service
I have a BigQuery routine that takes a start date and an end date as input parameters, and both input parameters are declared of type String in the routine. This routine is working fine without errors. I want to invoke the BigQuery routine from Azure Data Factory.…
Upgrade to new Snowflake Linked Service, buggy with Source=mscorlib
Hi, we recently upgraded our Snowflake linked service to the newer version, and once we released it to our production environment, half of our activities started throwing Failure happened on 'Source' side. 'Type=System.Exception,Message=[Snowflake]…
ADF Data Flow Mapping shows 0 rows written to Dataverse sink
Running a simple set of pipelines with data flow mappings to Dataverse sinks. The data flow runs successfully, but the Rows Written metric in the interface consistently shows 0 rows. I can confirm the rows were written by checking the table in Dataverse.
How to setup an ETL pipeline using DB that cannot be accessed via Azure Functions
I need to create a Python ETL pipeline to update a table used in a Power BI report, but I'm facing two issues. First, one of the ETL inputs is a self-hosted on-premises SQL Server database. I can connect to this SQL database within Azure Data Factory…
How to reference Json fields in copy data (ADF)?
I am currently integrating an API response into a SQL Server database using Azure Data Factory. The response from the API is structured in JSON format, as shown below: The goal is to extract the id values from each batch and insert them into a SQL…
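In ADF this kind of extraction is usually done with a collection reference or a flatten transformation, but the underlying operation is just flattening nested arrays. A Python sketch under an assumed response shape (the `batches`/`records`/`id` field names are hypothetical, since the actual payload is not shown):

```python
import json

# Hypothetical API response shape: batches, each containing records with "id".
response = json.loads("""
{
  "batches": [
    {"records": [{"id": 1}, {"id": 2}]},
    {"records": [{"id": 3}]}
  ]
}
""")

# Flatten all id values across batches, mirroring what a copy activity
# mapping with a collection reference (or a data flow flatten) produces.
ids = [rec["id"] for batch in response["batches"] for rec in batch["records"]]
print(ids)  # [1, 2, 3]
```

Each flattened `id` then maps to one row in the SQL sink.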
What does ADF SSIS Error "Disk Full Exception Caught" mean
Hi, we are running an Azure-SSIS type Integration Runtime in Azure Data Factory. This morning this IR suddenly became Unavailable for some reason. Fortunately we had the Diagnostic Settings ADFSSISIntegrationRuntimeLogs enabled, and I found the following…
Why are there two TenantIds returned for an ADFPipelineRun log analytics query?
I am getting duplicate results in my log analytics queries on the ADFPipelineRun table, except for the TenantId column, which has two results instead of one. The __ResourceIds are identical for both rows. I can filter on one of the TenantIds, but why…
Error Rerunning Failed Data Factory Pipeline from Failed Activity with a Secure Output Activity
Hi all, I have a pipeline in Azure Data Factory that is run each day, which is made of several execute pipeline activities. Some of these pipelines have web activities in them which use secure output. The error I am facing is that on occasion, one of…
Data Factory Pipeline source pointing to Azure Data Lake failing
Hello: I have a Source in a Pipeline which points to a DataSet. If I test the DataSet the connection tests fine and I can preview the data. The schema is also being picked up correctly. There is an extra attribute in the first position, called Prop_0. …
New BigQuery connector CopyData error when there's no data
I am trying to replace the Google BigQuery connector for the newer one before the deprecation. But the new Google BigQuery connector seems to make the "Copy Data" task fail when the source is a Query and there's no data returned from it. This…
ADF SSIS Integration Runtime setup issues - Custom DNS
The files needed to set up custom DNS on the Azure-SSIS integration runtime in ADF, referenced in this article, are no longer available. https://video2.skills-academy.com/en-us/azure/data-factory/azure-ssis-integration-runtime-express-virtual-network-injection#dns Can…
Migrating ADF to Salesforce v2 Dataset, how to get MAX() value of Object's LastModifiedDate
Hello, I'm migrating to the new Salesforce v2 linked service and dataset. We have a dynamic set up to load the delta from Salesforce into our SQL Server DWH every night. We loop over all Objects that we sync, and start off with executing this statement…
Data Factory pipeline that only appends new and updated data
I currently have a pipeline in Data Factory that copies several tables from an Oracle database to an Azure SQL DB. The pipeline runs once per day. What is currently being done is that at each run, the database and the tables under it in the Azure SQL DB are…
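The usual alternative to a full reload is a watermark-based incremental load: read only rows whose modification timestamp exceeds the last recorded watermark, upsert them into the sink, then advance the watermark. A minimal sketch of that logic; the `modified_at` column, row shape, and in-memory sink are hypothetical stand-ins for what an ADF pipeline would do against Oracle and Azure SQL:

```python
from datetime import datetime

def incremental_load(source_rows, sink, watermark):
    """Copy only rows changed since the last watermark, then advance it."""
    changed = [r for r in source_rows if r["modified_at"] > watermark]
    for row in changed:
        sink[row["id"]] = row  # upsert: insert new rows, overwrite updated ones
    new_watermark = max((r["modified_at"] for r in changed), default=watermark)
    return new_watermark

# Hypothetical source table with a last-modified column.
rows = [
    {"id": 1, "modified_at": datetime(2024, 1, 1)},
    {"id": 2, "modified_at": datetime(2024, 1, 3)},
]
sink = {}
wm = incremental_load(rows, sink, watermark=datetime(2024, 1, 2))
print(sorted(sink))  # only id 2 changed after the watermark, so only it is copied
```

In ADF itself the same pattern is typically built with a lookup on a watermark table, a copy activity whose source query filters on the watermark, and an upsert-capable sink.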
Error on sink data to MySQL - communications link failure
I've got a pipeline with some Data Flow activities that run in parallel. I got the error on last night's execution and I'd like some help understanding why it happened and how to prevent it from happening again. Job failed due to reason: at Sink 'EnviaFerias':…
ADF - converting list of lists into a proper JSON format
Hello, I'm pretty new to ADF and I can't wrap my head around one case. To keep it simple: I have a source (REST API response), which theoretically is JSON but it doesn't have a "proper" JSON format. It's supposed to be a simple table but it…
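A common shape for such "list of lists" responses is a header row followed by data rows. Assuming that shape (the column names below are hypothetical), converting it to proper JSON objects is a matter of zipping each row against the header:

```python
import json

# Hypothetical "list of lists" payload: the first inner list holds column
# names, the remaining inner lists hold row values.
raw = [["id", "name"], [1, "alpha"], [2, "beta"]]

header, *rows = raw
records = [dict(zip(header, row)) for row in rows]

print(json.dumps(records))
# [{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]
```

The resulting array of objects is the "proper" JSON shape ADF mapping and sink schemas expect.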