Creating a second external location on the same path in Azure Databricks Unity Catalog gives a path conflict error. Is there any way to solve this?

Ashwini Gaikwad 110 Reputation points
2024-06-04T10:23:22.8733333+00:00

Hello Team,

Creating a second external location or external volume on the same path (either pointing to a different folder or to the root location) in Azure Databricks Unity Catalog fails with a path conflict error; see below for details. Is there any way to solve this?

Input path url 'abfss://containername@storageaccount.dfs.core.windows.net/operations_layer' overlaps with other external tables or volumes within 'CreateVolume' call. Conflicting tables/volumes: test_d.external_volume_schema.operations_layer_config.

The volume I am trying to create is mapped to the location: abfss://container@storageaccount.dfs.core.windows.net/operations_layer/

The existing volume is already mapped to the location: abfss://container@storageaccount.dfs.core.windows.net/operations_layer/config
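
For reference, a minimal sketch of what I am running (the exact statements may differ slightly; the new volume name operations_layer_root is just a placeholder):

```sql
-- Existing volume (already created), mapped to the 'config' sub-folder
CREATE EXTERNAL VOLUME test_d.external_volume_schema.operations_layer_config
LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/operations_layer/config';

-- New volume I am trying to create; this fails with the overlap error above
CREATE EXTERNAL VOLUME test_d.external_volume_schema.operations_layer_root
LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/operations_layer/';
```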

Please let me know if there is a way to create the external volumes/locations as described above, and if not, what the recommended solution is.

Regards,

Ashwini Gaikwad


Accepted answer
  1. PRADEEPCHEEKATLA-MSFT 84,381 Reputation points Microsoft Employee
    2024-06-04T10:39:00.69+00:00

    @Ashwini Gaikwad - Thanks for the question and for using the MS Q&A platform.

    Azure Databricks recommends never creating an external volume or external table at the root of an external location. Instead, create external volumes and external tables in sub-directories within an external location. These recommendations should help avoid accidentally overlapping paths. See "Paths for Unity Catalog objects cannot overlap" in the documentation.

    Root cause: The volume location overlaps with other external tables or volumes. Within a UC metastore, two tables or volumes cannot be created on the same (or an overlapping) location, even across different catalogs. Databricks explicitly couples external storage paths to individual external tables and volumes on a strict 1:1 basis across the entire metastore.

    Solution: Use a different, non-overlapping location for the new table or volume, or remove the overlapping external tables/volumes.

    To resolve this issue, you can create a new sub-folder within the existing external location and map the new volume to that folder, making sure it does not overlap with the existing volume's path. For example, you can create a folder such as 'data' within the existing external location 'abfss://container@storageaccount.dfs.core.windows.net/operations_layer/' (a sibling of the existing 'config' folder) and map the new volume to it, as sketched below.
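
    A minimal Databricks SQL sketch of that approach, assuming the catalog and schema from the error message (test_d.external_volume_schema); the volume name operations_layer_data and the 'data' sub-folder are hypothetical examples:

    ```sql
    -- Map the new external volume to a sibling sub-folder ('data' is a hypothetical name)
    -- so that it does not overlap with the existing volume at .../operations_layer/config
    CREATE EXTERNAL VOLUME test_d.external_volume_schema.operations_layer_data
    LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/operations_layer/data';
    ```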

    Alternatively, you can create a new external location with a different path and then map the new volume to that location. For example, you can create a new external location 'abfss://container@storageaccount.dfs.core.windows.net/operations_layer_new/' and then map the new volume to that location, as in the sketch below.
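
    A hedged sketch of this alternative, again in Databricks SQL. The external location name, the volume name, the 'raw' sub-folder, and the storage credential my_storage_credential are hypothetical and need to be adjusted to your environment; per the recommendation above, the volume is mapped to a sub-directory rather than the root of the new location:

    ```sql
    -- Register a new, non-overlapping external location (credential name is hypothetical)
    CREATE EXTERNAL LOCATION IF NOT EXISTS operations_layer_new
    URL 'abfss://container@storageaccount.dfs.core.windows.net/operations_layer_new/'
    WITH (STORAGE CREDENTIAL my_storage_credential);

    -- Create the new volume in a sub-directory of that location
    CREATE EXTERNAL VOLUME test_d.external_volume_schema.operations_layer_new_volume
    LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/operations_layer_new/raw';
    ```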

    For reference, the documentation also lists some common status codes returned by SYNC, along with recommendations to address them during a UC upgrade: see "Common status codes returned by SYNC".

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, do click Accept Answer and Yes for "Was this answer helpful".


1 additional answer

Sort by: Most helpful
  1. Amira Bedhiafi 18,501 Reputation points
    2024-06-04T12:35:30.6466667+00:00

    In your case, both paths are part of the same directory structure, which is why you are facing the conflict.

    You can create separate containers or directories for different external locations/volumes, ensuring that the paths do not overlap:

    • Existing volume: abfss://container@storageaccount.dfs.core.windows.net/operations_layer/config
    • New volume: abfss://container@storageaccount.dfs.core.windows.net/operations_layer_new

    If you must use the same container, ensure that the folder structures do not overlap:

    • Existing volume: abfss://container@storageaccount.dfs.core.windows.net/operations_layer/config
    • New volume: abfss://container@storageaccount.dfs.core.windows.net/operations_layer_new/config
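
    Before creating the new volume, it can also help to check which paths are already registered so you can pick one that does not overlap. A small sketch using standard Databricks SQL commands (the catalog, schema, and volume names are taken from the error message above):

    ```sql
    -- List the external locations and volumes that are already registered
    SHOW EXTERNAL LOCATIONS;
    SHOW VOLUMES IN test_d.external_volume_schema;

    -- Inspect the storage path of the existing volume from the error message
    DESCRIBE VOLUME test_d.external_volume_schema.operations_layer_config;
    ```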