How do I assign a managed identity to a Databricks resource?
I have created a linked service from ADF to a Databricks cluster, following https://techcommunity.microsoft.com/t5/azure-data-factory-blog/azure-databricks-activities-now-support-managed-identity/ba-p/1922818 and I have provided Contributor access to the ADF…
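For reference, a minimal sketch of the ADF linked-service payload for managed-identity (MSI) authentication against Databricks, assuming an existing interactive cluster; all names, URLs, and IDs below are placeholders, not values from the question:

```python
# Sketch of an ADF linked service definition for Azure Databricks using
# managed identity (MSI) authentication. All IDs/URLs are placeholders.
import json

def databricks_msi_linked_service(workspace_url, workspace_resource_id, cluster_id):
    """Build the linked service payload ADF expects for MSI authentication."""
    return {
        "name": "LS_Databricks_MSI",  # hypothetical linked service name
        "properties": {
            "type": "AzureDatabricks",
            "typeProperties": {
                "domain": workspace_url,                      # e.g. https://adb-123.4.azuredatabricks.net
                "authentication": "MSI",                      # use the factory's managed identity
                "workspaceResourceId": workspace_resource_id, # full ARM ID of the workspace
                "existingClusterId": cluster_id,
            },
        },
    }

payload = databricks_msi_linked_service(
    "https://adb-1234567890123456.7.azuredatabricks.net",
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Databricks/workspaces/<ws>",
    "0123-456789-abcde123",
)
print(json.dumps(payload, indent=2))
```

For MSI authentication, the factory's managed identity also needs the Contributor role on the workspace, which matches the role assignment described above.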
What do "Premium Automated Serverless Compute - Promo DBU" expenses arise from, how can I disable them, and why are the costs so high?
What do "Premium Automated Serverless Compute - Promo DBU" expenses arise from, how can I disable them, and why are the costs so high? Details below.
Azure Databricks exercise error
I keep receiving the error "No such file or directory: /your_correct_source_value/wikipedia/pagecounts/staging_parquet_en_only_clean". When I checked Wikipedia, it appears this dataset has been deprecated since 2016-08-01. Could a new dataset be…
Issues while writing into bad_records path in Databricks
Hello all, I would like to get your input on a scenario I am seeing while writing to the bad_records path. I am reading a ‘Ԓ’-delimited CSV file based on a schema that I have already defined. I have enabled error handling while reading the file to…
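A minimal sketch of the read options described above, assuming a Databricks runtime (`badRecordsPath` is a Databricks-specific option); paths are illustrative placeholders:

```python
# Sketch: read a 'Ԓ'-delimited CSV with a predefined schema and route
# malformed rows to badRecordsPath (Databricks-specific option).
def bad_records_read_options(delimiter, bad_records_path):
    """Options to pass to spark.read for bad-record capture on Databricks."""
    return {
        "sep": delimiter,                    # custom single-character delimiter
        "header": "true",
        "badRecordsPath": bad_records_path,  # malformed rows land here as JSON files
    }

opts = bad_records_read_options("Ԓ", "/mnt/logs/bad_records")
# On a cluster you would apply these as:
# df = spark.read.options(**opts).schema(my_schema).csv("/mnt/raw/input.csv")
```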
Best option to implement departmental secrecy in an Azure Databricks resource?
I have an Azure Databricks resource created in my Azure portal. I want to achieve departmental secrecy within a single existing Databricks resource, so I am looking for a solution where I can add multiple workspaces to my single Databricks resource. How…
How can I connect Azure Databricks to Neo4j?
Hello, I want to connect to Neo4j from Azure Databricks. What different approaches do I have? I am trying to connect here and am getting the following error. Do I need to do anything before running the code? I mean, set up a managed identity or enable…
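One common approach, sketched under the assumption that the official `neo4j` Python driver is installed on the cluster (e.g. via `%pip install neo4j`) and the workspace has network access to the Neo4j host; the URI scheme, host, and credentials are placeholders:

```python
# Sketch: connection configuration for the official neo4j Python driver.
# Host and credentials are placeholders, not real values.
def neo4j_connection_config(host, user, password):
    """Build driver arguments for neo4j.GraphDatabase.driver(uri, auth=...)."""
    return {"uri": f"neo4j+s://{host}:7687", "auth": (user, password)}

cfg = neo4j_connection_config("my-neo4j.example.com", "neo4j", "<password>")
# On the cluster:
# from neo4j import GraphDatabase
# with GraphDatabase.driver(cfg["uri"], auth=cfg["auth"]) as driver:
#     driver.verify_connectivity()
```

No managed identity is required for this route; the driver authenticates directly against Neo4j, so the usual blockers are credentials and network reachability from the cluster.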
org.apache.hadoop.fs.FileAlreadyExistsException: Failed to rename temp file
[Repeat question due to an old thread] We have built a streaming pipeline with Spark Auto Loader. The source folder is an Azure Blob container. We've encountered a rare issue (we could not replicate it). Below is the exception…
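For context, a sketch of a typical Auto Loader source setup; paths are placeholders. One assumption worth checking for this class of error is that no two streams share the same checkpoint or schema location, since concurrent writers racing on the same temp files can surface as rename failures:

```python
# Sketch: Auto Loader (cloudFiles) source options. Paths are placeholders;
# each stream should get its own schemaLocation and checkpoint path.
def autoloader_options(fmt, schema_location):
    return {
        "cloudFiles.format": fmt,                    # e.g. "json", "csv", "parquet"
        "cloudFiles.schemaLocation": schema_location,  # must be unique per stream
    }

opts = autoloader_options(
    "json",
    "abfss://container@account.dfs.core.windows.net/_schemas/source_a",
)
# On a cluster:
# df = (spark.readStream.format("cloudFiles").options(**opts)
#         .load("abfss://container@account.dfs.core.windows.net/source/"))
```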
Run a Databricks notebook from ADF - error finding the Azure module to save the data in Blob Storage
Hi guys, the requirement is: call a REST API, read the records in JSON Lines format, and load them into a table in Azure SQL Server. I used Databricks to read the JSON Lines from the Open API using a Python script. It can read and keep the data in a file in Azure Blob…
Creating a second external location for the same path in Azure Databricks Unity Catalog gives a conflicting-path error. Is there any way to solve this?
Hello Team, when creating a second external location or external volume on the same path with a different folder, or at the root location, Azure Databricks Unity Catalog gives a path-conflict error (see below for details). Is there any…
The SCIM API adds users to the admins group by default in Azure Databricks
Hi, when we invoke the SCIM API in Azure Databricks, it adds users to the admins group by default, and even after deleting users from only the admins group, they are created again. Calling the SCIM API to add groups as users also adds them…
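A hedged sketch of a SCIM user-create payload with an explicitly empty groups list; the endpoint path, host, and token below are placeholders, and whether a default-group rule still applies depends on workspace and IdP provisioning settings:

```python
# Sketch: minimal SCIM user-create payload for the Databricks SCIM endpoint.
# The user name is a placeholder; groups is left explicitly empty so any
# admins-group membership must come from a provisioning rule, not the request.
def scim_user_payload(user_name):
    return {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": user_name,
        "groups": [],  # explicitly no group membership in the request
    }

body = scim_user_payload("jane.doe@contoso.com")
# On a machine with workspace access (host/token are placeholders):
# import requests
# requests.post(f"{host}/api/2.0/preview/scim/v2/Users",
#               headers={"Authorization": f"Bearer {token}",
#                        "Content-Type": "application/scim+json"},
#               json=body)
```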
Connect to Blob Storage from Azure Databricks SQL
I would like to read a table from a CSV file in Azure Blob Storage in my own account and load it into a table in Unity Catalog on Databricks (hopefully using SQL). I have tried this SQL command: CREATE TABLE IF NOT EXISTS <table_name>; COPY…
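A sketch of the COPY INTO pattern the question is attempting, with placeholder table and path names; access to the path via a Unity Catalog external location or storage credential is assumed:

```python
# Sketch: create an empty Unity Catalog table, then COPY INTO it from CSV on
# Blob Storage. Table and path names are placeholders.
def copy_into_sql(table, source_path):
    return f"""
        CREATE TABLE IF NOT EXISTS {table};
        COPY INTO {table}
        FROM '{source_path}'
        FILEFORMAT = CSV
        FORMAT_OPTIONS ('header' = 'true', 'inferSchema' = 'true')
        COPY_OPTIONS ('mergeSchema' = 'true');
    """

stmt = copy_into_sql("main.raw.my_table",
                     "abfss://data@myaccount.dfs.core.windows.net/input/")
# Run each statement separately via spark.sql(...) or the Databricks SQL editor.
```

`COPY_OPTIONS ('mergeSchema' = 'true')` lets the schemaless `CREATE TABLE` pick up its columns from the first load.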
What Azure service should I use to deploy a complex Python program on the cloud?
Background: I have developed a Python program that fetches data from three different REST APIs, processes it, and inserts it into a database. The program also queries the database to identify which values to fetch from the APIs, so there is…
Deploying Azure Databricks with a data lake
Deploying Azure Databricks creates an additional resource group in the background, which includes a data lake. Is it possible to use the data lake that I have already deployed in Azure instead of the one provisioned by Azure Databricks?
Integrating Databricks notebooks in Azure ML using SDK V2
Hi all, we currently have some Azure Databricks notebooks in production that we would like to integrate into Azure ML using the v2 SDK. I found resources for integrating these notebooks using the databricks_step in the v1 SDK. The official documentation…
Databricks Dev/Prod setup
We are a data team of 4 people. To make the process easier and more productive, can we separate dev/prod environments at the Databricks catalog level rather than the workspace level? Can anyone share any thoughts on this? Thanks
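A sketch of what catalog-level separation can look like in Unity Catalog, with illustrative catalog and group names; workspace-level isolation adds stronger boundaries (separate clusters, configs), but catalog-level grants alone can keep non-admins out of prod:

```python
# Sketch: catalog-level dev/prod separation in Unity Catalog.
# Catalog and group names are illustrative placeholders.
def env_catalog_sql(env, team_group):
    """SQL statements to create an environment catalog and grant access."""
    return [
        f"CREATE CATALOG IF NOT EXISTS {env}",
        f"GRANT USE CATALOG ON CATALOG {env} TO `{team_group}`",
    ]

stmts = (
    env_catalog_sql("dev", "data-team")          # whole team works in dev
    + env_catalog_sql("prod", "prod-deployers")  # only a deploy group touches prod
)
# Run each with spark.sql(stmt) on a UC-enabled cluster, or in the SQL editor.
```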
My dev, test, and prod environments are in different resource groups of the same subscription. How do I create a DevOps pipeline in this case?
Hi, my dev, test, and prod environments are in different resource groups of the same subscription. I am involved in a data engineering project where I will primarily be using the resources below: ADLS for data storage, ADF for orchestration, Azure Databricks for QC…
How to ignore records in ADF Data Flows
Hi all, I am building a data transformation using mapping data flows. I have a timestamp field, TimeStampUpdated, in the target table. I want to look up historical data against the incremental data transformation and ignore the records coming in the…
Access issue with app registration
I've created a Databricks workspace and a new notebook, but I don't have access to the secret keys under the app registration; they are disabled for me. How can I solve this issue? Warning message: "You do not have access. Your administrator has disabled the…"
Access to C:\Data not allowed. Error Code 22853
Access to C:\Data is not allowed, Error Code 22853. Is there any way to work around this?
How to Create a Delta Table in Azure Synapse Analytics with an Auto-Increment Identity Column?
I have created Delta Lake tables in ADLS using a Synapse notebook, and in one of those tables I want to add an identity column (auto-increment 1,1), but I am not able to create it. Below are my CREATE TABLE script and the error I am facing. Table…
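A sketch of identity-column DDL for a Delta table; names and paths are placeholders, and identity columns depend on runtime support for the corresponding Delta writer feature, which older Synapse Spark runtimes may not have:

```python
# Sketch: Delta table DDL with a GENERATED ALWAYS AS IDENTITY column.
# Table name and ADLS path are placeholders.
def identity_table_sql(table, location):
    return f"""
        CREATE TABLE {table} (
            Id BIGINT GENERATED ALWAYS AS IDENTITY (START WITH 1 INCREMENT BY 1),
            Name STRING
        )
        USING DELTA
        LOCATION '{location}'
    """

ddl = identity_table_sql(
    "my_db.customers",
    "abfss://lake@myadls.dfs.core.windows.net/delta/customers",
)
# On a runtime that supports identity columns: spark.sql(ddl)
```

If the runtime rejects this syntax, the feature is likely unsupported there; a workaround is to generate surrogate keys in Spark (e.g. `monotonically_increasing_id` plus an offset) rather than in the DDL.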