How do I create a SHA-256 signature for the Oracle Cloud integration REST API?
Hi Experts, Infra: Azure Synapse. We are using pipelines / the API method to get data and create a data warehouse. Source: We have Oracle Cloud Infrastructure (OCI) as the source, which outputs the report to Oracle Cloud Storage. OCI has a REST API…
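A minimal sketch of the body-digest part of OCI request signing: OCI expects the base64-encoded SHA-256 of the request body in the `x-content-sha256` header (the full scheme additionally signs selected headers with your RSA API key, which is not shown here; everything beyond the header name is illustrative):

```python
import base64
import hashlib

def content_sha256(body: bytes) -> str:
    """Base64-encoded SHA-256 digest of the request body, suitable for
    OCI's x-content-sha256 header."""
    return base64.b64encode(hashlib.sha256(body).digest()).decode("ascii")

# An empty body has a well-known digest:
print(content_sha256(b""))  # 47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU=
```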
When will Workflow Orchestration Manager be updated to newer Airflow versions?
The WOM has been stuck at Airflow version 2.6.3 for a while now. The current upstream Airflow version is 2.9.2. Is there any (public) timeline for when newer versions will be available? Is there a general release schedule?
How can I add pagination as a parameter for the Azure Data Factory Rest API call to retrieve all data?
How can I add pagination (limit and offset) as a parameter for the Pipelines_ListByFactory API call? https://video2.skills-academy.com/en-us/rest/api/datafactory/pipelines/list-by-factory?view=rest-datafactory-2018-06-01&tabs=HTTP I have 65 pipelines in…
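As far as I know, the ADF list APIs page via a `nextLink` continuation URL in each response rather than limit/offset parameters. A client-side sketch under that assumption (`fetch_page` is a placeholder for the real authenticated HTTP call):

```python
def list_all(fetch_page, first_url):
    """Collect items from ADF-style paged responses of the form
    {"value": [...], "nextLink": "<url of next page>"} by following
    nextLink until it is absent."""
    items, url = [], first_url
    while url:
        page = fetch_page(url)          # e.g. an authenticated GET returning parsed JSON
        items.extend(page.get("value", []))
        url = page.get("nextLink")      # absent on the last page
    return items
```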
How to fix the Copy activity error while copying data from a Databricks Delta table to a data lake in CSV format
There are some error tables in a Databricks Delta table. Those tables need to be extracted as CSV and loaded into Azure Data Lake, inside a folder of the container. Staging has been enabled in the Copy activity since it is a two-step process. Approx row count of…
Dataflow and copy activity upsert behavior
Hi, we are using Azure Synapse to copy data from our data lakehouse (ADLS Gen2 storage) to Dataverse tables. We're using the Dynamics 365 linked service. To do this we are making use of a combination of data flows and copy steps, using 'upsert' as our…
Lookup in ADF
Hi, how do I use lookup table columns in a Copy activity, and can we write SQL in a Lookup? Any screenshots appreciated.
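A sketch of the usual pattern, assuming an Azure SQL source (activity, dataset, and column names are illustrative): a Lookup activity does accept a SQL query, and its output rows can then be referenced from later activities.

```json
{
  "name": "LookupParams",
  "type": "Lookup",
  "typeProperties": {
    "source": {
      "type": "AzureSqlSource",
      "sqlReaderQuery": "SELECT SourceTable, TargetPath FROM dbo.CopyConfig"
    },
    "dataset": { "referenceName": "ConfigDataset", "type": "DatasetReference" },
    "firstRowOnly": false
  }
}
```

Downstream activities can then read columns with expressions such as `@activity('LookupParams').output.firstRow.SourceTable`, or iterate `@activity('LookupParams').output.value` when `firstRowOnly` is false.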
Functions in ADF
Hi, what are the functions in ADF? Please help me with screenshots.
ADF-Global parameters inside Linked Service
Hi all! My company is starting to make use of a DevOps pipeline to share the same code across all the production and non-production environments. Before this, everything was wild. For example, the linked service for Azure Key Vault has the URL written…
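One common way out of the hardcoded URL, sketched here with illustrative names: parameterize the linked service and feed the per-environment value in from outside (for example from a global parameter or a pipeline parameter), instead of baking the Key Vault URL into the linked service definition.

```json
{
  "name": "AzureKeyVaultLS",
  "properties": {
    "type": "AzureKeyVault",
    "parameters": {
      "kvBaseUrl": { "type": "string" }
    },
    "typeProperties": {
      "baseUrl": "@{linkedService().kvBaseUrl}"
    }
  }
}
```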
Azure Data Factory Copy activity: convert SQL string into a JObject
I have a query from an Azure SQL database which returns a JSON string like the one below, and I want to insert it into Cosmos DB as a JSON object (not a string): "{\"type\":…
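Inside ADF the usual routes are a Data Flow parse transformation or the `@json()` expression; the underlying conversion is just string-to-object parsing. A Python sketch with an illustrative payload standing in for the query result:

```python
import json

# Illustrative stand-in for the escaped JSON string the SQL query returns
raw = "{\"type\": \"order\", \"id\": 1}"

doc = json.loads(raw)      # parses the string into a real object
print(type(doc).__name__)  # dict
print(doc["type"])         # order
```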
Copy activity failed: Failure happened on 'Source' side
Trying to get data from PostgreSQL to Blob storage using a Copy activity gives the error below after the query runs for ~30 min: Operation on target Copy Customer failed: Failure happened on 'Source' side.…
Integration runtime setup: "Download and install integration runtime" manual setup page not found
I am writing to seek assistance regarding two issues I am encountering with the Azure Data Factory integration runtime setup. Express setup internal server error: whenever I attempt to select "Option 2: Manual setup" and proceed to "Step…
Upsert operation in azure data factory
We want to copy data from production and send it to the dev environment daily. As a solution, we're thinking of creating a pipeline in Azure Data Factory to fetch all data from the source database, scramble the PII in a few columns, and send it to a destination…
How does the ADF linked service for Azure Databricks align with a job compute policy defined inside Azure Databricks?
This is a two-part question. First part, context: I have ADF, which contains several data pipelines. Some pipelines also include a Databricks notebook as an activity. I have created a linked service in Azure Data Factory to facilitate the pipeline which…
RFC table options
Hi, I'm using Azure Data Factory and a Copy step to copy data from SAP. To shrink the data volume I've tried to use the RFC table option in combination with a pipeline variable. This fails with an SAP error message that it cannot be parsed. When I hardcode the…
ForEach dynamic batch count
I have a scenario in which I need to provide a dynamic batch count in a ForEach loop. Is there any way to do it? Right now I don't see any option for this; we have to hardcode it in the pipeline.
How to read Excel file data (stored in ADLS Gen2) in an Azure Data Factory pipeline
I have a source Excel file which arrives in the format shown. Please help with how to read an Excel file in that specific format. Thank you.
ADF Debug pipeline : Use activity runtime
What is the difference between 'Use data flow debug session' and 'Use activity runtime' when debugging in Azure Data Factory? I have a pipeline with a Lookup activity followed by a ForEach activity. A lookup file is used to store parameters, and this file is…
ADF Data Flow activity stuck in Queued status when the pipelines run
When I trigger the pipelines, they get stuck at the Data Flow activity in Queued status.
Data Factory's Data Flow cost structure
Hi all, I have a question regarding the cost structure for Data Factory's Data Flow. Is the charge solely per cluster/integration runtime, or are the costs doubled if two pipelines run simultaneously, I mean, both using the same cluster/integration…
How do I reference my dataset parameter in the base URL in an Azure REST Copy activity?
Good day, is there a way I can reference my dataset parameter in the base URL of my REST Copy activity? I am trying to create a dynamic URL:
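For the REST connector the base URL itself lives on the linked service, so the usual pattern is to parameterize the dataset's relative URL (or parameterize the linked service itself) and build the dynamic part there. A sketch with illustrative names:

```json
{
  "name": "RestDataset",
  "properties": {
    "type": "RestResource",
    "linkedServiceName": {
      "referenceName": "RestLinkedService",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "relPath": { "type": "string" }
    },
    "typeProperties": {
      "relativeUrl": {
        "value": "@dataset().relPath",
        "type": "Expression"
      }
    }
  }
}
```

The Copy activity then supplies `relPath` per run, for example via `@concat('items/', pipeline().parameters.itemId)`.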