Index Out of Range Error When Creating Snowflake Linked Service
I have tried several times to create a Snowflake linked service. Every time I try, I receive the same error. The portal offers no other information on what is happening. I found someone else on SO having the same issue. Can anyone help with this? The…
Get Status of a Triggered Pipeline
Hi All, we have the below scenario wherein we are triggering ADF pipelines via another ADF pipeline through a Web activity: url":…
Is there a way to handle SQL insert errors in Copy activities within a pipeline?
Say I am copying a SQL table from one database to another (Azure SQL). I have 1000 rows, but one fails with a SQL exception on insert. Is there a way to handle these errors so that 999 rows are inserted and the other one is saved in logs or Azure…
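The Copy activity's fault-tolerance settings (skip incompatible rows and redirect them to a log) are the usual answer to this pattern. Purely as illustration, the skip-and-log behavior can be sketched in Python; `insert_row` is a hypothetical stand-in for the database insert:

```python
def copy_with_skip(rows, insert_row):
    """Insert each row; collect failures instead of aborting the whole copy."""
    skipped = []
    for row in rows:
        try:
            insert_row(row)
        except Exception as exc:  # e.g. a SQL constraint violation
            skipped.append((row, str(exc)))  # redirect the bad row to a log
    return skipped

# Example: one row violates a hypothetical "value must be positive" constraint.
def insert_row(row):
    if row["value"] < 0:
        raise ValueError("CHECK constraint violated")

rows = [{"value": 1}, {"value": -5}, {"value": 3}]
skipped = copy_with_skip(rows, insert_row)
# 999-of-1000 analogue: 2 of 3 rows inserted, 1 skipped and logged
```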
To create a Pipeline that generates ‘incremental’ folder names
I have the following task: I have a pipeline in an Azure Data Factory (V2). The pipeline contains several Copy Data elements. Each of them copies data from a SQL database into an FTP folder. As a result, the pipeline generates files like…
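One common approach (a sketch, not the asker's final design) is to derive the next folder name from the names already present; the hypothetical prefix `export_` and zero-padded counter illustrate the incremental-naming logic:

```python
def next_folder_name(existing, prefix="export_"):
    """Return the next incremental folder name, e.g. export_002 -> export_003."""
    numbers = [
        int(name[len(prefix):])
        for name in existing
        if name.startswith(prefix) and name[len(prefix):].isdigit()
    ]
    next_n = max(numbers, default=0) + 1
    return f"{prefix}{next_n:03d}"

print(next_folder_name(["export_001", "export_002"]))  # export_003
```

In ADF itself the same idea is usually expressed with a pipeline expression (e.g. building the folder name from a timestamp or a stored counter) rather than listing the FTP directory each run.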
Access Azure Data Lake Gen2 from an Azure subscription
Hello, I'm deploying an ETL scenario to move data from SQL Server to Azure Data Lake Gen2. When starting the process, I can see that specifying Azure Data Lake Gen2 as the target is defining Azure Blob Storage as the target. Is this the right procedure? …
Data Factory can't see the stored procedure in my Azure SQL database
In SQL I have created a user and role with full access to my schemas: -- Create the user with password for users that authenticate at the database CREATE USER DBowner WITH PASSWORD = 'pwrd'; -- I'm going to create a role and I want that role to have…
File System Linked Service User Name and Password
I am trying to move on-premises files to an Azure Data Lake using the File System linked service as the source. My files are in OneDrive and I have set up a self-hosted integration runtime (hopefully). My first question is: if it's OneDrive, can I still…
Azure Data Factory concatenation of columns
Hi, for one of the requirements in Azure Data Factory we need to derive a primary key column based on the parameter p_primary_key_cols. Input data:
int_col|varchar_col|date_col
10|xyz|2020-02-13
20|abc|2020-03-12
pipeline…
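The derivation being asked for amounts to joining the configured columns into one string per row. A minimal sketch in Python, assuming the parameter is a list of column names and `|` is the delimiter:

```python
def derive_primary_key(row, key_cols, sep="|"):
    """Concatenate the configured columns into a single primary-key string."""
    return sep.join(str(row[c]) for c in key_cols)

# Row and parameter mirror the sample input above.
row = {"int_col": 10, "varchar_col": "xyz", "date_col": "2020-02-13"}
p_primary_key_cols = ["int_col", "varchar_col", "date_col"]
print(derive_primary_key(row, p_primary_key_cols))  # 10|xyz|2020-02-13
```

In a Mapping Data Flow the equivalent is a Derived Column transformation that concatenates the columns named by the parameter.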
Azure Data Factory: precision and scale as parameters to toDecimal() Function
Hi, providing the precision and scale as parameters to the toDecimal() function in Azure Data Factory gives a "Should be integer literal" error. Please provide a way to resolve this issue. Thanks, Saikumar
In spite of having an Azure runtime allocated, each activity has a separate "AcquiringCompute" step?
Question: In our pipeline, we have around 10 Mapping Data Flow activities, in serial fashion (one after another). Each of them is configured to use the same integration runtime (Azure managed runtime). In spite of this configuration, each activity shows…
Cannot connect to SQL database (ADF) - Pipeline -> DataFlow -> Sink
Hi Team, {"message":"at Sink 'XXXXXXXXXXSink': java.lang.RuntimeException: Cannot connect to SQL database: 'jdbc:sqlserver://XXXXXXXXXX.database.windows.net;database=XXXXXXXXXX', 'User: XXXXXXXXXX'. Please check the linked service…
Key Vault Azure Data Factory problem
Hello, I've created a Key Vault service to store a secret with the credentials to access my Cosmos DB collection. The secret value is as I show you: …
Mapping data flow design question
Hi all- I am a long-time .NET developer doing some ADF work. I am trying to design a data flow in which data from an Azure SQL database table is transformed and inserted into another Azure SQL database table. If the record is successfully inserted, I…
ADF Copy Issue - Long File Path Names
Hello, I am using the Copy activity to copy files from on-premises to an Azure Storage account. The job is failing as there is a long file path name at the source. Can someone help me fix this issue? I am getting the following error: …
Azure Data Factory Linked Service to On-Premises SQL Server with named instance
Hello, I'm trying to connect from Azure Data Factory to an on-premises SQL Server with a named instance (e.g. servername\instance1) using the SQL Server linked service. I already have an integration runtime installed, configured, and working (tested with other servers). …
Pipeline fails at data flow
I have created a Copy Data activity and a Data Flow and have added them to my pipeline. When debugging the pipeline, Copy Data runs successfully; however, it fails at the data flow with the error: {"message":"at Sink 'OutputStreamName':…
Azure Data Factory Copy activity Cosmos DB error
Hello, I'm getting the following error when I try to map an array in a Copy Data activity from a CSV file to Cosmos DB. The structure in Cosmos DB for the item should be as follows: "InsuredItems": [ { "Accomodation":…
REST API JSON Date Format
I am trying to copy data from a REST API source using the Azure Copy activity. I have used REST API as the source and Azure SQL DB as the target. But the JSON response I am receiving has the date in the below format: {Createddate: /date(345667999)/} But when I…
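That `/date(...)/` value is an epoch timestamp in milliseconds wrapped in a string. A minimal Python sketch of the conversion (the function name is hypothetical; the sample value is the one from the question):

```python
import re
from datetime import datetime, timezone

def parse_ms_json_date(value):
    """Convert a '/date(<epoch-ms>)/' string to a timezone-aware datetime."""
    match = re.fullmatch(r"/date\((-?\d+)\)/", value, flags=re.IGNORECASE)
    if not match:
        raise ValueError(f"not a /date(...)/ value: {value!r}")
    millis = int(match.group(1))
    return datetime.fromtimestamp(millis / 1000, tz=timezone.utc)

print(parse_ms_json_date("/date(345667999)/"))  # 1970-01-05 00:01:07.999000+00:00
```

Inside ADF the same unwrapping would be done in a mapping or a Data Flow expression before landing the value in a SQL datetime column.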
How to insert data into Azure Synapse Analytics through Azure Data Factory without PolyBase
I have tried using a Move & Transform activity in Data Factory but I'm getting 2 errors at publishing: 'Data flow activity 'dataflow1' requires a stagingLinkedService; Data flow activity 'dataflow1' requires a stagingStoragePath'. I don't want to…