Data Factory Time Discrepancy: Seeking Explanation
In Data Factory, I see a total transfer time of 23 minutes, but the sum of the reading and writing times is much lower: my reading time is 4:48 and my writing time is 7:12. Can someone help me understand this discrepancy?
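For what it's worth, the gap is usually time spent outside the actual read/write phases (queueing, listing source files, establishing connections, throttling waits). A quick sanity check of the arithmetic, assuming the durations shown are mm:ss:

```python
def to_seconds(mmss: str) -> int:
    """Convert an 'mm:ss' duration string to seconds."""
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + int(seconds)

reading = to_seconds("4:48")   # 288 s
writing = to_seconds("7:12")   # 432 s
total = 23 * 60                # 1380 s total transfer time

# Time not accounted for by reading + writing:
unaccounted = total - (reading + writing)
print(unaccounted // 60, "minutes")  # 11 minutes
```

So roughly 11 of the 23 minutes fall outside the read/write counters, which is consistent with overhead rather than data movement.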
ADLS Gen2 with hierarchical namespace is not allowing failover
I have created an ADLS Gen2 account with hierarchical namespace enabled, and it's not allowing failover. I need to do failover testing.
How to protect sensitive data in Azure?
I would like to load sensitive data into Azure Data Lake Storage Gen2. I need to make sure that this data cannot be read by the global administrator or any other kind of super user. How can this be realized? I think role-based access control is not…
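One pattern that satisfies this requirement is client-side encryption: encrypt the data before it ever reaches the storage account, keeping the key somewhere the storage administrators cannot read (e.g. a Key Vault with restrictive access policies), so even a global administrator with full storage access only sees ciphertext. A minimal sketch using the third-party `cryptography` package; the local key generation here is only illustrative, not a Key Vault integration:

```python
from cryptography.fernet import Fernet

# In practice the key would live in Azure Key Vault with access
# policies excluding the storage administrators; generating it
# locally here is only for illustration.
key = Fernet.generate_key()
cipher = Fernet(key)

plaintext = b"sensitive customer record"
ciphertext = cipher.encrypt(plaintext)   # this is what gets uploaded to ADLS

# Anyone reading the blob without the key sees only ciphertext.
assert ciphertext != plaintext
assert cipher.decrypt(ciphertext) == plaintext
```

The trade-off is that server-side features (queries, analytics) can no longer read the data directly, so this fits raw sensitive payloads better than data you want to process in place.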
Unable to use Synapse workspace features in Synapse Link for Dataverse when Synapse network is in private mode
Hi Team, We are using Synapse Link for Dataverse to sync data from Dynamics 365 to ADLS Gen2, where it can be used for analytics. As public network access to our Synapse Analytics environment is disabled and we are using private endpoints, we cannot enable…
How to migrate data from Azure databases to the container of storage accounts?
Hi friends, I want to migrate data from two Azure Synapse Analytics databases to my ADLS Gen2 container. What is a simple and efficient way? Thanks
Failure happened on 'destination' side. ErrorCode=DeltaInvalidCharacterInColumnName.
This exception occurred when I used a Data Factory pipeline to copy data from SQL Server to a lakehouse, but I didn't find any problems with the raw data.
Azure Synapse Link for Dataverse to bring Dynamics 365 data to ADLS and Synapse for analytical purposes: facing issues and inconsistencies in the feature
We are using the Synapse Link for Dataverse feature to bring Dynamics 365 data to ADLS and Synapse Analytics. We have the following open issues, due to which we are unable to finalize the solution: In the F&O-linked Dataverse environment, we have created a synapse…
Error connecting Synapse to ADLS: "This request is not authorized to perform this operation using this permission." (403, HEAD)
I am trying to select data from a delta table in ADLS Gen2 storage and keep receiving this error. I added the Synapse service principal as Storage Blob Data Contributor and set ACLs on the container, with no luck. The firewall is set to allow all networks as well. Please…
Storage Account Limitations
Can anyone clarify the following storage account questions: How many storage accounts can be created? Microsoft's documentation indicates a quota of 250 storage accounts, which can be increased to a maximum of 500. Is it possible to have…
How to copy files from a SharePoint drive that has a huge hierarchy of folders and subfolders
We want to get all files from SharePoint into our ADLS container. The SharePoint drive holds around 60k files, but through our existing pipeline we are getting only 600 files. We have a web activity with the below…
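A common cause of "only 600 of 60k files" is pagination: the SharePoint/Graph listing endpoints return results in pages, and each response carries an `@odata.nextLink` URL that must be followed until it disappears, so a single web activity only ever sees the first page(s). A minimal sketch of the follow-the-nextLink loop, with the HTTP call abstracted as a `fetch_page` function (the page contents below are mocked purely to demonstrate the pattern):

```python
def list_all_items(first_url, fetch_page):
    """Collect items across all pages by following @odata.nextLink."""
    items, url = [], first_url
    while url:
        page = fetch_page(url)             # e.g. an authenticated GET returning parsed JSON
        items.extend(page.get("value", []))
        url = page.get("@odata.nextLink")  # absent on the last page
    return items

# Mock three pages of two items each to demonstrate the loop.
pages = {
    "p1": {"value": [1, 2], "@odata.nextLink": "p2"},
    "p2": {"value": [3, 4], "@odata.nextLink": "p3"},
    "p3": {"value": [5, 6]},
}
print(list_all_items("p1", pages.get))  # [1, 2, 3, 4, 5, 6]
```

In ADF terms this means putting the web activity inside an Until loop keyed on the `@odata.nextLink` property of the previous response.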
How to copy files from one storage account to another with the same last-modified date using ADF
Hi Team, when we run an ADF copy job to copy data from the Landing storage account to the Prod storage account, the last-modified date is set to the time of the copy, but we want to preserve the last-modified date we have in Landing…
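Note that a blob's Last-Modified property is set by the storage service at write time and cannot be overridden by a copy, so the usual workaround is to carry the source timestamp along as custom blob metadata, which the ADF Copy activity sink can set. A small stdlib-only sketch of preparing such a metadata entry; the metadata key name is an arbitrary choice:

```python
from datetime import datetime, timezone

def source_timestamp_metadata(last_modified: datetime) -> dict:
    """Build a custom-metadata dict that preserves the source file's
    last-modified time as an ISO-8601 string (metadata values must be text)."""
    return {"src_last_modified": last_modified.astimezone(timezone.utc).isoformat()}

meta = source_timestamp_metadata(datetime(2024, 5, 1, 12, 30, tzinfo=timezone.utc))
print(meta)  # {'src_last_modified': '2024-05-01T12:30:00+00:00'}
```

Downstream consumers then read the metadata value instead of the service-managed Last-Modified property.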
Is it possible to transfer files from a Logic App to an SAP AL11 directory?
Hi Experts, Is it possible to transfer files from a Logic App to an SAP AL11 directory? Thanks, Satish Pattanayak
Previous version of an updated file in Azure Data Lake with hierarchical namespace (container)
How can we get the previous version of uploaded/modified files when my storage is a Data Lake with hierarchical namespace enabled? Could you please suggest how to get the previous version, either by versioning, snapshots, or, as a last option, backup and restore? All the…
I have previously been able to map a received dataset to a data store, but have been unable to do so today: the OK button does not become enabled after selecting a data store folder. Has something changed?
I received some (Azure Data Lake Storage Gen2 Folder) datasets from an outside (partner) organization through a data share invite. Previously I was able to map the dataset to a folder in one of my (Azure Blob Storage) data stores, but today the OK button…
How to set up a modern architecture for a small/medium business?
Currently we're using the following setup, which is slow to process the data and slow on the Power BI side: an Azure VM for third parties to upload via SFTP; a C# script to ETL data to Azure SQL Server and move files to ADLS Gen2; a Power BI report pulling…
While running the Databricks migrate pipeline, facing an issue with an invalid configuration for the storage account key
Hello Team, I am trying to run the YAML pipeline for Azure Databricks migration from a non-UC workspace to a UC workspace; for reference, this is the repo: https://github.com/databrickslabs/migrate. While exporting the Hive metastore, I am running into an error…
Authentication failed when accessing dataset in compute instance using its MSI via AzureML SDK v2
The dataset can be previewed in the AML workspace when the associated datastore is created using identity-based auth with the workspace managed identity. I have the same user-assigned managed identity assigned to the workspace and the compute instance, but it only failed in…
Data Lake as storage/database for an Express/Angular application?
Currently I'm using SQL Server for our structured data: a client uploads a file with a minimum of a million records, which goes to blob storage, and then those million records get inserted into different tables in SQL. What I want to…
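Whatever the final storage choice, loads of a million-plus rows usually go through fixed-size batches rather than row-by-row inserts: each batch becomes one bulk INSERT or one staged file upload. A generic, stdlib-only batching helper as a sketch of that pattern:

```python
from itertools import islice

def batched(rows, size):
    """Yield lists of at most `size` rows from any iterable (e.g. a CSV reader)."""
    it = iter(rows)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

# Each chunk would become one bulk INSERT / one staged file upload.
chunks = list(batched(range(10), 4))
print(chunks)  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

With SQL Server specifically, feeding such chunks to a bulk-load mechanism (rather than individual INSERT statements) is what makes million-row files tractable.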
Secured Kusto connection during data ingestion from a Fabric notebook to a lakehouse
Hi team, I'm looking for SN+I auth or another secured auth method for Kusto in a Fabric notebook.
Self-hosted machine unable to access Data Lake Storage account when running a pipeline using Synapse
Hello - I need your help again. Here's the story: I have an Azure Synapse workspace; I have created a managed private endpoint for ADLS (working fine); I have created a private endpoint for ADLS (working fine); on ADLS I have set the public access…