Databricks cluster does not work with free trial subscription
In Azure Databricks, I tried to create a cluster. Cluster creation requires providing a worker type and a driver type, each of which needs a minimum of 4 cores, so 8 cores in total. From what I have read, the free trial subscription gives only 4 cores. So how…
dbutils.widgets.dropdown() issue: com.databricks.dbutils_v1.TooManyDefaultChoices: Too many default choices (1116). Limit is 1024
Hi, while running dbutils.widgets.dropdown(), I hit the following error: com.databricks.dbutils_v1.TooManyDefaultChoices: Too many default choices (1116). Limit is 1024. I searched but could not find an answer on how to change the default value. For sure,…
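One workaround, a sketch assuming the choice list can simply be capped at the documented limit, is to deduplicate and truncate the values before creating the widget (`safe_dropdown_choices` and `create_dropdown` are illustrative names, not part of dbutils):

```python
def safe_dropdown_choices(values, limit=1024):
    """Deduplicate, stringify, and truncate values to the widget choice limit."""
    unique = sorted(set(str(v) for v in values))
    return unique[:limit]

def create_dropdown(dbutils, name, values):
    # Not invoked here; call this inside a Databricks notebook where
    # dbutils is available. The first choice is used as the default.
    choices = safe_dropdown_choices(values)
    dbutils.widgets.dropdown(name, choices[0], choices)
```

With 1116 distinct values this keeps the first 1024 in sorted order; if the dropped values matter, a text widget may be a better fit than a dropdown.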
Mounting entire ADLS on azure databricks
Hi, I want to mount an entire ADLS storage account on Databricks. I've checked the documentation: I can mount a single filesystem at a time, but I want to mount the entire ADLS account on Databricks. I have around 70 containers in my ADLS account and I want to mount all of them…
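There is no single call that mounts a whole account, but a loop over the container names can mount each one in turn. A minimal sketch, assuming ADLS Gen2 and that `extra_configs` already holds the OAuth settings from the mount documentation (`mount_source` and `mount_all_containers` are illustrative names):

```python
def mount_source(account, container):
    """abfss URI for one ADLS Gen2 container."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/"

def mount_all_containers(dbutils, account, containers, extra_configs):
    # Mount each container under /mnt/<container>, skipping existing mounts.
    existing = {m.mountPoint for m in dbutils.fs.mounts()}
    for c in containers:
        point = f"/mnt/{c}"
        if point not in existing:
            dbutils.fs.mount(source=mount_source(account, c),
                             mount_point=point,
                             extra_configs=extra_configs)
```

The container list itself would have to come from your side (for example via the Azure storage SDK or a hard-coded list), since dbutils cannot enumerate containers.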
remainingCores is too small error. Spark and Kafka cluster on HDInsight problem
I tried to create a Kafka and Spark cluster in HDInsight using either of the templates in these links: …
Can I access Gen1 and Gen2 Data Lake sources from the same Databricks cluster
If I have access to Gen1 Data Lake sources on a Databricks cluster, can I add Gen2 sources to the same cluster and access both Gen1 and Gen2 from it, or do I have to create a separate cluster for connecting to Gen…
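The two generations use different Hadoop configuration prefixes (`fs.adl.*` for Gen1, `fs.azure.account.*` for Gen2), so both sets of settings can coexist on one cluster. A sketch of direct (non-mount) access with a service principal, assuming the standard key names from the ADLS connector documentation; the helper names and the idea of returning dicts are illustrative:

```python
def gen1_conf(client_id, client_secret, tenant_id):
    # Hadoop config keys for ADLS Gen1 direct access (service principal).
    return {
        "fs.adl.oauth2.access.token.provider.type": "ClientCredential",
        "fs.adl.oauth2.client.id": client_id,
        "fs.adl.oauth2.credential": client_secret,
        "fs.adl.oauth2.refresh.url":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

def gen2_conf(account, client_id, client_secret, tenant_id):
    # Hadoop config keys for ADLS Gen2 direct access (service principal).
    sfx = f"{account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{sfx}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{sfx}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{sfx}": client_id,
        f"fs.azure.account.oauth2.client.secret.{sfx}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{sfx}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

def configure(spark, conf):
    # Apply both sets of settings on the same cluster/session.
    for k, v in conf.items():
        spark.conf.set(k, v)
```

With both applied, `adl://…` and `abfss://…` paths can be read in the same notebook.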
Calling Azure Databricks query from Azure Function
Hi everyone, is it possible to call a SQL query on an Azure Databricks cluster from inside an Azure Function? Normally, to connect to SQL Server from an Azure Function, we create a connection string in application settings and retrieve it using environment…
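One option is the Databricks SQL Statement Execution REST API, which an Azure Function can call over HTTPS. A stdlib-only sketch, assuming a SQL warehouse (not an all-purpose cluster) and a personal access token or AAD token stored in the Function's application settings; the function names are illustrative:

```python
import json
import urllib.request

def build_statement_request(host, token, warehouse_id, sql):
    """Build a request for the Databricks SQL Statement Execution API."""
    payload = {"warehouse_id": warehouse_id,
               "statement": sql,
               "wait_timeout": "30s"}
    return urllib.request.Request(
        f"https://{host}/api/2.0/sql/statements/",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )

def run_statement(host, token, warehouse_id, sql):
    # Call this from the Azure Function body; returns the parsed JSON response,
    # which contains the statement status and result rows.
    req = build_statement_request(host, token, warehouse_id, sql)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Alternatives include the Databricks JDBC/ODBC endpoint or triggering a job via the Jobs API, depending on latency requirements.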
Is it possible to add a Databricks Role/Group to appRoles for OAuth 2.0 (Azure AD) tokens
I have configured a service principal for Azure Databricks in Azure AD as per this documentation: https://video2.skills-academy.com/en-us/azure/databricks/dev-tools/api/latest/aad/service-prin-aad-token I am able to access Azure Databricks using an OAuth…
Alternative to using Azure SQL Managed Instance for Complex calculations
I have inherited an architecture which carries out complex calculations in Azure SQL Managed Instance via Databricks. Databricks is connected via the Apache Spark connector. Databricks does the calculations in SQL and the end results are in tables…
Databricks certification in Azure
Hello team, I am planning to get certified in Databricks. I just wanted to know: is there a certification for Databricks on Azure? If yes, please advise the exam number; otherwise, please advise which one I can get certified in. Thanks in advance…
Not able to delete any resource
Please help me with this error: I am not able to delete the resource group.
Apache Spark Event Hubs connector OAuth authentication - is it supported?
Does anyone have an example of reading streams from Azure Event Hubs using OAuth authentication in Databricks? I can find only examples that use SAS keys and, to be honest, I am not sure whether this is officially supported.
Security Recommendations for Azure Data and Analytics Services
I am working on securing data and analytics services on Azure. I want to know which security controls I can apply after creating the services, and which I can apply only during service creation. Below are the recommendations I have found so far. Could…
How to add the BeautifulSoup library in Azure Databricks
Hi all, how do I add the BeautifulSoup library in Azure Databricks? I want to run the code below in a PySpark notebook: from bs4 import BeautifulSoup import pandas as pd table = BeautifulSoup(open('C:/age0.html','r').read()).find('table') df =…
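A sketch of one way to do this, assuming the cluster has internet access for the install; note that a local path like C:/age0.html is not visible to the cluster, so the file would need to be uploaded to DBFS and opened as, e.g., /dbfs/FileStore/age0.html (the inline HTML below just keeps the example self-contained):

```python
# In a Databricks notebook, install the package first (notebook-scoped):
# %pip install beautifulsoup4

from bs4 import BeautifulSoup

# Stand-in for open('/dbfs/FileStore/age0.html').read():
html = "<table><tr><td>35</td><td>42</td></tr></table>"
table = BeautifulSoup(html, "html.parser").find("table")
cells = [td.get_text() for td in table.find_all("td")]
print(cells)  # ['35', '42']
```

Libraries can also be attached cluster-wide via the cluster's Libraries tab (PyPI package `beautifulsoup4`), which avoids re-installing per notebook.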
Want to read Microsoft Teams posts in Azure Data Factory
I have a requirement to read Microsoft Teams posts in Azure Data Factory. Currently I am using Logic Apps, but it generates output as an HTML file, which is difficult to read. What is the best way to get data from Microsoft Teams so we can read the data in…
Are there any other free streaming sources like Twitter for practicing Spark and Kafka
Friends, I am learning Kafka and Spark. I worked on Kafka and Spark integration using the Twitter API, but I want to do more practice. Are there any other free streaming sources like Twitter for practicing Spark and Kafka?
Using a PySpark DataFrame to insert data into a table
Hello, I am working on inserting data into a SQL Server table dbo.Employee. When I use the PySpark code below, I run into the error org.apache.spark.sql.AnalysisException: Table or view not found: dbo.Employee. The table exists, but I am not able to insert…
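The usual cause of that error is referring to dbo.Employee by name in Spark SQL: Spark's catalog does not know about the SQL Server table, so the write has to go through the JDBC data source instead. A sketch, assuming an Azure SQL server name and the Microsoft JDBC driver on the cluster (the helper names are illustrative):

```python
def jdbc_url(server, database):
    # 1433 is the default SQL Server port; server/database names are assumptions.
    return f"jdbc:sqlserver://{server}:1433;database={database}"

def insert_employees(df, server, database, user, password):
    # Append the DataFrame's rows into dbo.Employee over JDBC.
    (df.write.format("jdbc")
       .option("url", jdbc_url(server, database))
       .option("dbtable", "dbo.Employee")
       .option("user", user)
       .option("password", password)
       .mode("append")
       .save())
```

The DataFrame's column names and types must line up with the target table's schema for the append to succeed.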
Write JSON document to Azure table
Hi, I am using the code below to write a JSON document from an Azure Data Lake Gen2 container into a SQL Server table. Code: df =…
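Since the original code is truncated, here is a minimal sketch of the overall shape, assuming an abfss path for the source document and JDBC for the SQL Server write; all names and options are illustrative:

```python
def abfss_uri(container, account, path):
    """Assumed layout of an ADLS Gen2 URI."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{path}"

def json_to_sql(spark, src_uri, jdbc_url, table, user, password):
    # multiLine handles pretty-printed JSON documents (one object across lines).
    df = spark.read.option("multiLine", True).json(src_uri)
    (df.write.format("jdbc")
       .option("url", jdbc_url)
       .option("dbtable", table)
       .option("user", user)
       .option("password", password)
       .mode("append")
       .save())
```

If the JSON is nested, the struct columns would need to be flattened before the JDBC write, since SQL Server columns are scalar.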
How to dynamically explode an array-type column in PySpark or Scala
Hi, I have a parquet file with complex column types, with nested structs and arrays. I am using the script from the link below to flatten my parquet file. https://video2.skills-academy.com/en-us/azure/synapse-analytics/how-to-analyze-complex-schema …
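A generic flatten can be written as a loop that keeps expanding until no struct or array columns remain, which handles arbitrary nesting without hard-coding column names. A sketch (not the script from the linked article; the helper names are mine):

```python
def complex_fields(dtypes):
    """Given (name, type_string) pairs as returned by df.dtypes,
    return the names of struct/array columns still left to flatten."""
    return [n for n, t in dtypes
            if t.startswith("struct") or t.startswith("array")]

def flatten(df):
    # Iteratively explode arrays and expand structs until only flat columns remain.
    from pyspark.sql.functions import col, explode_outer
    while True:
        targets = complex_fields(df.dtypes)
        if not targets:
            return df
        name = targets[0]
        if dict(df.dtypes)[name].startswith("array"):
            # One row per array element; array<struct> becomes struct next pass.
            df = df.withColumn(name, explode_outer(col(name)))
        else:
            # Promote struct children to top-level columns with prefixed names.
            children = df.select(f"{name}.*").columns
            df = df.select(
                [c for c in df.columns if c != name] +
                [col(f"{name}.{ch}").alias(f"{name}_{ch}") for ch in children])
```

explode_outer keeps rows whose array is null or empty, which plain explode would drop.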
How to move compressed parquet files using ADF or Databricks
Hi, I have a requirement to move parquet files from AWS S3 into Azure and then convert them to CSV using ADF. I tried to download a few files to my local file system and copy them via a Copy activity within ADF. The files are in this format …
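If Databricks is an option, Spark reads snappy-compressed parquet transparently, so the conversion is just a read and a rewrite; no decompression step is needed. A sketch, assuming the files have already landed in Azure storage and that the output naming below matches what is wanted (both helpers are illustrative):

```python
def csv_name(parquet_name):
    """'part-0000.snappy.parquet' -> 'part-0000.csv' (assumed naming)."""
    base = parquet_name
    for suffix in (".parquet", ".snappy", ".gz"):
        if base.endswith(suffix):
            base = base[: -len(suffix)]
    return base + ".csv"

def parquet_to_csv(spark, src_dir, dst_dir):
    # Snappy compression is handled by the parquet reader automatically.
    (spark.read.parquet(src_dir)
        .coalesce(1)              # single output file; only for small data
        .write.mode("overwrite")
        .option("header", True)
        .csv(dst_dir))
```

In pure ADF, the same conversion works with a Copy activity whose source dataset is Parquet and sink dataset is DelimitedText, with no local download step.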
How to transform files in subfolders with one script in Databricks
I have an ADLS Gen2 folder with subfolders containing parquet files in each folder. My requirement is to transform all parquet files in the subfolders and load them into another folder in ADLS Gen2 with the same folder structure, using one script. Is it possible to do, or…
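Yes, one script can do this by walking the source tree recursively with dbutils.fs.ls and mirroring each file's relative path under the destination root. A sketch, assuming the transformation is a DataFrame-to-DataFrame function passed in by the caller (all names are illustrative):

```python
def dest_path(src_path, src_root, dst_root):
    """Map a source path onto the destination tree, keeping the structure."""
    assert src_path.startswith(src_root)
    return dst_root + src_path[len(src_root):]

def transform_tree(spark, dbutils, src_root, dst_root, transform):
    # Recursively walk src_root; transform each parquet file and write it
    # to the mirrored location under dst_root.
    for entry in dbutils.fs.ls(src_root):
        if entry.isDir():
            transform_tree(spark, dbutils, entry.path,
                           dest_path(entry.path, src_root, dst_root), transform)
        elif entry.path.endswith(".parquet"):
            out = dest_path(entry.path, src_root, dst_root)
            transform(spark.read.parquet(entry.path)) \
                .write.mode("overwrite").parquet(out)
```

If the subfolder layout happens to be Hive-style partitioning (key=value), a single partitioned read/write would be simpler than walking the tree by hand.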