Data Factory Salesforce objects not the same as objects seen in the Salesforce URL
I have created a linked service to connect to the company-supplied Salesforce URL using the customised URL, the supplied user ID and password, and I have acquired the correct security token. This connection tests successfully. These are also the same parameters…
Spark Connector in ADF
Hi, I have created a Spark connector to connect to Azure Databricks. In the Copy activity, the source is the Spark connector and the sink is Azure SQL DB. In the Spark connector query, CreatedDate is being converted to String and throwing an error, whereas it is a timestamp…
Copy Activity data source preview error
Hi Team, I have set up a Copy Activity pipeline to move some tables to the Data Lake. However, I have realised a few of the tables have an issue when I connect them to the Copy activity source. Per the error pop-up below, could you please check and…
How to check pipeline CPU usage
Hi team, I have some pipelines. Whenever certain pipelines run, CPU usage goes beyond 80%. I checked CPU usage in Task Manager, and it shows the self-hosted IR process running and taking too much CPU. How can I determine…
Error while creating data factory
I have a free subscription account, and I'm trying to create a data factory using the sandbox, but I'm getting a 'deny' policy action error.
Data Factory Variable Comparison not working as expected.
Hi All, I'm running a simple pipeline that takes an input variable, adds 1 day to it, and compares it to another variable. However, the comparison isn't working as I expect. The variable boolTest is set to false after the update of the…
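One frequent cause of this symptom is a type/format mismatch: in the ADF expression language, addDays() returns a string timestamp, so an equals() against a date-typed or differently formatted variable evaluates to false even when the dates match. A sketch of a comparison that normalises both sides first (variable names inputDate and targetDate are hypothetical, not from the question):

```
@equals(
    formatDateTime(addDays(variables('inputDate'), 1), 'yyyy-MM-dd'),
    formatDateTime(variables('targetDate'), 'yyyy-MM-dd')
)
```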
Merge rows of the same file in Azure Data Factory
Hello, I need to merge multiple rows into one row in Azure Data Factory. For example, I have the following file:
PolicyId  Driver_name
0001      Adam
0001      Lucy
0002      Peter
At the end I need to have: PolicyId, Driver_name1, Driver_name2, …
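In a Mapping Data Flow this pivot is typically done with an Aggregate transformation collecting Driver_name per PolicyId; as a language-neutral illustration of that grouping logic, here is a minimal Python sketch (function and sample data are mine, not from the question):

```python
from collections import OrderedDict

def pivot_drivers(rows):
    """Group driver names under their policy id, one column per driver.

    rows: iterable of (policy_id, driver_name) tuples, in file order.
    Returns a list of output rows: [policy_id, driver1, driver2, ...].
    """
    grouped = OrderedDict()
    for policy_id, driver in rows:
        # Append each driver to the list for its policy, preserving order.
        grouped.setdefault(policy_id, []).append(driver)
    return [[pid] + drivers for pid, drivers in grouped.items()]

rows = [("0001", "Adam"), ("0001", "Lucy"), ("0002", "Peter")]
print(pivot_drivers(rows))
# [['0001', 'Adam', 'Lucy'], ['0002', 'Peter']]
```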
How to pull the data from Dynamic 365 to Data lake
<p>Hi <a rel="user" nodeId="74270" href="/answers/users/74270/chiragmishramsft-1092.html">@ChiragMishraMSFT-1092</a></p> <p>Can you please help me with the below error:</p> <p>{…
ADF - export to flat file with multiple record types
How to export to a flat file with multiple record types in Azure Data Factory from JSON. Example output file:
Header, 1, H1
Detail1, 1, D11, Test1, 10
Detail1, 2, D12, Test2, 20
Detail2, 1, D21, 100
Detail2, 2, D22, 200
Trailer, 1, T1, 2,…
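ADF's delimited-text format assumes one schema per file, so a common workaround is to shape the header/detail/trailer lines yourself (e.g. in an Azure Function or a Databricks notebook) and then copy the result. A minimal Python sketch of that shaping step, assuming a hypothetical JSON input shape since the real schema is not shown in the question:

```python
import json

# Hypothetical input shape matching the example output above.
doc = json.loads("""
{
  "header": {"id": "H1"},
  "detail1": [{"code": "D11", "name": "Test1", "qty": 10},
              {"code": "D12", "name": "Test2", "qty": 20}],
  "detail2": [{"code": "D21", "amount": 100},
              {"code": "D22", "amount": 200}],
  "trailer": {"id": "T1"}
}
""")

# Emit each record type with its own column layout, numbering details per type.
lines = ["Header, 1, " + doc["header"]["id"]]
for i, d in enumerate(doc["detail1"], start=1):
    lines.append(f"Detail1, {i}, {d['code']}, {d['name']}, {d['qty']}")
for i, d in enumerate(doc["detail2"], start=1):
    lines.append(f"Detail2, {i}, {d['code']}, {d['amount']}")
lines.append(f"Trailer, 1, {doc['trailer']['id']}, {len(doc['detail1'])}")
print("\n".join(lines))
```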
ADFV2 Copy Activity Support for more encodingNames
Hi, I am using the Copy activity to generate delimited text files by reading from Parquet and applying the respective encodings to the files. We see the list of encodings supported by the Copy activity (delimited text format), and still some…
Azure Data Factory add Tags in json files
Hello, in the Portal it is possible to add Tags to an Azure Data Factory (I'm not talking about the annotations in pipelines), but I can't find a way to include them in the ARM template. I just created a blank ADF, added a tag, and exported the ARM…
Data flow error: 4501 - related to WranglingDataFlow
Hi, for some reason I'm getting this error message when I try to debug a pipeline with a Wrangling Data Flow activity: { "errorCode": "4501", "message": "Failed to fetch…
JSON file node didn't load into SQL Server tables using Azure Data Factory
Hi Team, below is my JSON file for the source (Blob Storage JSON files). I am loading a SQL Server table using Azure Data Factory (Copy data activity). In the Mapping tab I selected the checkbox for the resolutions node, and data loaded only…
Log shipping
Dear experts, I have a two-node SQL Server Standard failover cluster. I need to set up a single-node DR using the log shipping method. After switching over to DR, how can we set up and plan so that the application can connect automatically to the DR database…
Azure Data Factory Dynamic Content
Hi there, I want to use a variable inside my Data Factory Copy activity. I'm trying to use a variable that I got from an Azure Function. When I run the debug, I can see that the variable is getting the value, but I can't access it as a…
Azure Data Factory Data Copy from Cosmos
We have a scenario where we have to transfer/copy newly added/updated data from Cosmos DB to a SQL Server every 2 hours. Currently, the Copy activity copies all the files. Can I configure it to copy only the newly added/updated data? Thanks
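One common pattern (a sketch, not the only option) is an incremental load keyed on Cosmos DB's system `_ts` property: store the highest `_ts` from the previous run (e.g. via a Lookup activity against a watermark table; names here are hypothetical) and filter the Copy activity's source query against it:

```sql
SELECT * FROM c
WHERE c._ts > @lastWatermark
```

After each successful copy, update the watermark to the new maximum `_ts` so the next 2-hourly run picks up only documents inserted or updated since.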
Comma inside data field impacts CSV file creation
Hi Team, I have set up a Copy activity that writes a SQL query's output to a .csv file in the Data Lake destination. But in one table, the address-level details contain a comma (,) inside the data fields. Because of that, the created…
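The usual fix in ADF is to set a quote character (e.g. `"`) on the delimited-text sink dataset so any field containing the delimiter gets wrapped. As an illustration of why quoting prevents the column shift, a minimal Python sketch (the sample address is invented):

```python
import csv
import io

rows = [["0001", "12 High St, Suite 4, Leeds"]]
buf = io.StringIO()
# QUOTE_MINIMAL wraps any field containing the delimiter in quotes,
# so the embedded commas no longer split the address into extra columns.
csv.writer(buf, quoting=csv.QUOTE_MINIMAL).writerows(rows)
print(buf.getvalue().strip())
# 0001,"12 High St, Suite 4, Leeds"
```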
Doubt with new Azure IR
Hi there. I know this is maybe a trivial question for most of you, but I am confused by this. To reduce the pipeline runtime, the administrator has created a new Azure IR. I understand that I need to change the setting on its…
Databricks Scala: DataFrame column encoding from UTF-8 to Windows-1252
Hi, I am working with Databricks, where I have the data in Parquet and I am generating smaller files out of it. I have a string column that contains various characters, and I have to encode this string value to Windows-1252 or Windows…
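In Databricks this is typically handled when writing out (e.g. the CSV writer's `encoding` option) or per value with `getBytes(Charset.forName("windows-1252"))` in a UDF. As a minimal, language-neutral illustration of the conversion itself, a Python sketch (the sample text is invented):

```python
text = "Caf\u00e9, na\u00efve"  # "Café, naïve"
# Re-encode from the internal Unicode representation to Windows-1252 bytes;
# characters with no cp1252 mapping are replaced instead of raising an error.
encoded = text.encode("cp1252", errors="replace")
print(encoded)
# b'Caf\xe9, na\xefve'
```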
How to replicate ongoing changes of an on-prem database to a data lake?
Just as AWS DMS handles ongoing replication and updates the bucket with the respective files, what is available in Azure to manage ongoing changes occurring in an on-prem SQL database? Initially it requires a full load into my data lake storage, which can be…