ADF copy activity for REST API POST requiring string value in request body field
I have an ADF copy activity where my source is a REST API POST call to a report process in one of our data sources. The parameters for the report are stored in a SQL Server table, and I am passing them in from a pipeline parameter. My expression for the…
How much would it cost to create an ADF Pipeline with two servers
I need to retrieve data from a MySQL backup file on Cyberduck, restore it in MySQL Azure, and then move it to our data warehouse in SQL Azure. How much would it cost to build the pipeline using Azure ADF?
SQL Bulk Copy failed
SQL Bulk Copy failed due to receiving an invalid column length from the BCP client. I am getting this error while copying data from ADLS to SQL Server. I am copying a CSV file. I checked the length of each and every column; the destination has a greater length when…
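This error usually means some source row exceeds the declared length of a destination column. A minimal diagnostic sketch (assuming an in-memory CSV; the file, column names, and sample data below are hypothetical) that reports the maximum observed length per column, so it can be compared against the destination schema:

```python
import csv
import io

def max_column_lengths(csv_text):
    """Return the maximum string length observed for each column in a CSV.

    Useful for spotting which source column overflows its destination
    VARCHAR length when bulk copy reports an invalid column length.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    maxima = {name: 0 for name in reader.fieldnames}
    for row in reader:
        for name, value in row.items():
            maxima[name] = max(maxima[name], len(value or ""))
    return maxima

# Hypothetical sample data
sample = "id,name,comment\n1,Ann,ok\n2,Bartholomew,a rather long remark\n"
print(max_column_lengths(sample))  # {'id': 1, 'name': 11, 'comment': 20}
```

For a real file, replace `io.StringIO(csv_text)` with an open file handle; also note that trailing delimiters or an unexpected quote character can shift values into the wrong column and inflate the reported lengths.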
Alternative to keyValues Function for Pipeline Expressions in ADF
I'm working with Azure Data Factory (ADF) and I need to create a key-value map within a pipeline expression. While the keyValues function works perfectly in Data Flows, it's not available in pipeline expressions. My scenario involves two separate arrays,…
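For reference, the Data Flow `keyValues(keys, values)` function pairs two equal-length arrays into a map; a common pipeline-side workaround (an assumption, not an official replacement) is to assemble a JSON string with `concat()`/`join()` and parse it with `json()`. The target semantics can be sketched in Python as:

```python
def key_values(keys, values):
    """Mimic the Data Flow keyValues(keys, values) semantics:
    pair two equal-length arrays into a single map."""
    if len(keys) != len(values):
        raise ValueError("keys and values must have the same length")
    return dict(zip(keys, values))

print(key_values(["env", "region"], ["dev", "westeurope"]))
# {'env': 'dev', 'region': 'westeurope'}
```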
Error Occurred While Importing Test Cases - Test Case IDs Not Found
Description: We encountered an error while importing test cases into the second Azure DevOps board. The error message received is as follows: Error occurred while importing one or more test cases: Test case(s) IDs not found in the current suite. Please…
How to deploy an Azure ARM template larger than 4 MB from one environment to another
Hi, I was trying to move an Azure Data Factory from DEV to QA using an ARM template that is 20 MB in size. When I export the ARM template from the DEV data factory and try to import it into the QA data factory using an Azure custom deployment, it fails to…
How to fix an "out of memory exception" while processing around 43 GB of data with a copy activity?
I am processing around 43 GB of data in a pipeline using a copy activity, and I am getting the error: ErrorCode=SystemErrorOutOfMemory,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A task failed with out of…
Why are so many Workflow Orchestration Manager configurations locked? Especially AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION is annoying.
What is the reason that so many configuration settings are locked, some to non-default values? For example, we would like to set AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION to True, but it is currently locked to False. This greatly hinders our workflow:…
How do you stop/pause or restart a Workflow Orchestration Manager instance?
We use git-sync with our WOM instance. For DAGs this works fine; changes to DAGs show up in Airflow very quickly. However, some changes require a restart of the instance, for example changes to a plugin (since plugins are only loaded once at startup).…
Copy CSV files to a Citrix ShareFile path (domain.sharefile.com) folder using ADF
Hello, I am working on a POC to build a small pipeline: I am trying to export a SQL table (Azure SQL DB) to a CSV file and need to upload it to a Citrix ShareFile path; the site URL looks like "https://domain.sharefile.com/home/shared".…
Creating a linked service in Azure Data Factory to connect to an Azure storage account
Hi, I created an Azure Data Factory without "Managed Virtual Network", then created an Azure storage account and disabled 'Public network access' on the storage account. Both the data factory and the storage account are in the same VNet and the same subnet. Later, I…
How do you add Airflow requirements to a Workflow Orchestration Manager instance without it crashing?
The Workflow Orchestration Manager has an option to add PyPI dependencies. However, when I add, for example, "apache-airflow-providers-databricks", the instance no longer starts (it tries to start for ~1 hour and then stops with an…
Is IMDS available on Azure-SSIS VMs?
Is the Azure Instance Metadata Service (IMDS) (169.254.169.254) available in an ADF Azure-SSIS integration runtime? If it isn't available, how else can I get a bearer token, e.g. to be able to access Azure Key Vault? I am looking to find alternatives to not…
I want to use "Show billing report by pipeline" in ADF and enable it via CI/CD
I want to enable the "Show billing report by pipeline" feature in Azure Data Factory, but I don't know how to enable it via a Terraform deployment or an ARM template deployment. How can I do that?
How to create a SHA-256 signature for the Oracle Cloud Integration REST API?
Hi Experts. Infra: Azure Synapse; we are using pipelines / the API method to get data and create a data warehouse. Source: we have Oracle Cloud Infrastructure (OCI) as the source, which outputs the report to Oracle Cloud Storage. OCI has a REST API…
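For context, OCI's request-signing scheme signs a "signing string" with RSA-SHA256 and, for requests with a body, also requires an `x-content-sha256` header: the base64-encoded SHA-256 digest of the body. A minimal sketch of just that digest step (the RSA signing itself needs the API key and is not shown):

```python
import base64
import hashlib

def content_sha256(body: bytes) -> str:
    """Base64-encoded SHA-256 digest of the request body, as used for
    the x-content-sha256 header in OCI request signing."""
    return base64.b64encode(hashlib.sha256(body).digest()).decode("ascii")

# Digest of an empty body, a fixed well-known value
print(content_sha256(b""))  # 47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU=
```

Comparing this computed value against what the API rejects is a quick way to tell whether the digest or the RSA signature step is the part that is failing.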
Using PowerShell, how can I set a shared integration runtime in a data factory when the runtime exists in a different subscription?
Hi team, I'm looking to configure a shared integration runtime in an ADF that resides in a different subscription from the one where the IR exists. I reviewed…
When will Workflow Orchestration Manager be updated to newer Airflow versions?
WOM has been stuck at Airflow version 2.6.3 for a while now. The current upstream Airflow version is 2.9.2. Is there any (public) timeline for when newer versions will be available? Is there a general release schedule?
How can you import data into the Workflow Orchestration Manager Database?
We have been using Airflow on-premises for a long time and would like to migrate to WOM without losing our execution history. Is there a way to import data into the Airflow meta-DB?
IF Condition with 'AND' and 'OR' Operator in ADF Pipeline Expression Builder
I am making an API call in an Azure Function and retrieving values for 7 different attributes. Six of these attributes return Boolean values, whereas one attribute, Requested Timestamp, returns either null or a timestamp value. I am…
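In ADF pipeline expressions, `and()` and `or()` are two-argument prefix functions, so combined conditions end up nested, e.g. `@and(a, or(b, c))`. The intended logic (all Boolean attributes true, plus a presence check on the timestamp; the names below are hypothetical) can be sketched in Python as:

```python
def gate(flags, requested_timestamp):
    """Combined IF condition: every Boolean attribute is true AND the
    Requested Timestamp attribute is present (not null)."""
    return all(flags) and requested_timestamp is not None

print(gate([True] * 6, "2024-06-01T00:00:00Z"))  # True
print(gate([True] * 6, None))                    # False
```

The same shape holds in the expression builder: flatten the six Boolean checks into nested `and()` calls, and put the null/value alternative for the timestamp inside an `or()`.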
The new Snowflake connector alters the session by default, setting MULTI_STATEMENT_COUNT = 1
I was using a Script activity to execute multiple statements in Snowflake from ADF. The new Snowflake connector is not able to run SnowSQL statements separated by semicolons in one Script activity. When I check the query history in the warehouse, below…