Copy data between two Azure directories

KEERTHANA JAYADEVAN 66 Reputation points
2024-03-13T12:25:20.3133333+00:00

I have data in a directory A in a subscription/storage account. I need to have a copy of this in another directory B. How can I do this?

Azure Storage Accounts

4 answers

  1. Andreas Baumgarten 108K Reputation points MVP
    2024-03-13T12:51:40.57+00:00

    Hi @KEERTHANA JAYADEVAN,

    One option to copy data between Azure Storage Accounts is the Azure Storage Explorer: Azure Storage Explorer

    A second option is AzCopy: Copy blobs between Azure storage accounts by using AzCopy


    (If the reply was helpful, please don't forget to upvote and/or accept it as answer, thank you)

    Regards

    Andreas Baumgarten


  2. yogendra chauhan 0 Reputation points
    2024-03-13T12:54:27.4533333+00:00

    Using AzCopy, for example to copy a directory between Azure file shares:

    azcopy copy 'https://<source-storage-account-name>.file.core.windows.net/<file-share-name>/<directory-path><SAS-token>' 'https://<destination-storage-account-name>.file.core.windows.net/<file-share-name><SAS-token>' --recursive
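
    The command above targets an Azure Files share. If the data lives in Blob storage instead, a similar sketch (the account, container, and directory names are placeholders) would be:

    azcopy copy 'https://<source-storage-account-name>.blob.core.windows.net/<container-name>/<directory-path><SAS-token>' 'https://<destination-storage-account-name>.blob.core.windows.net/<container-name><SAS-token>' --recursive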


  3. Nehruji R 7,306 Reputation points Microsoft Vendor
    2024-03-14T10:01:08.19+00:00

    Hello KEERTHANA JAYADEVAN,

    Greetings! Welcome to Microsoft Q&A forum.

    To copy data from one directory in an Azure storage account to another, you can use the AzCopy v10 command-line utility. It can copy blobs, directories, and containers between storage accounts.

    For examples of other tasks, such as uploading files, downloading blobs, and synchronizing with Blob storage, see the documentation linked below.

    AzCopy uses server-to-server APIs, so data is copied directly between storage servers. These copy operations don't use the network bandwidth of your computer. Copy blobs between Azure storage accounts by using AzCopy

    Transfer data with AzCopy and file storage

    AzCopy uses server-to-server APIs when copying data between storage accounts, so it does not consume the local bandwidth of your computer. It is currently the fastest way to migrate data between accounts. To increase performance, you can try decreasing the logging level, separating the copy into multiple operations, and turning off length checking. See Optimize for large numbers of small files for more information.
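
    For example, a minimal sketch of those tuning options (the account, container, and SAS values are placeholders; --log-level and --check-length are the relevant AzCopy flags):

    azcopy copy 'https://<source-account>.blob.core.windows.net/<container>/<directory><SAS-token>' 'https://<destination-account>.blob.core.windows.net/<container><SAS-token>' --recursive --log-level=ERROR --check-length=false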

    Learn how to use the Azure CLI, AzCopy, and .NET code to copy blobs between storage accounts. Copy blobs between Azure Storage accounts from the command line or by using code
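
    As a rough sketch of the Azure CLI route (the account, container, and SAS placeholders are illustrative), az storage blob copy start-batch can copy every blob under a virtual directory to another account:

    # Copy all blobs under a source directory prefix into the destination container
    az storage blob copy start-batch \
        --account-name <destination-account> \
        --destination-container <destination-container> \
        --source-account-name <source-account> \
        --source-container <source-container> \
        --source-sas '<SAS-token>' \
        --pattern '<directory-path>/*' \
        --auth-mode login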

    This article provides an overview of some of the common Azure data transfer solutions. The article also links out to recommended options depending on the network bandwidth in your environment and the size of the data you intend to transfer. Choose an Azure solution for data transfer.

    Additional information: You may need the Storage Blob Data Contributor role on the storage accounts to copy data from storage account/subscription A to B.
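
    If that role is missing, a minimal sketch of assigning it with the Azure CLI (the assignee and scope values below are placeholders) looks like this:

    # Grant Storage Blob Data Contributor on the target storage account
    az role assignment create \
        --role "Storage Blob Data Contributor" \
        --assignee <user-or-service-principal-id> \
        --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"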

    If you copy to or from an account that has a hierarchical namespace, use blob.core.windows.net instead of dfs.core.windows.net in the URL syntax. Multi-protocol access on Data Lake Storage enables you to use blob.core.windows.net, and it is the only supported syntax for account to account copy scenarios.

    Alternatively, you can try using Azure Data Factory to perform this activity:

    • Azure Data Factory is a cloud-based data integration service that allows you to create, schedule, and manage data-driven workflows.
    • You can create a pipeline in Azure Data Factory to copy data from one directory (source) to another (destination).
    • Set up a Copy Data activity within your pipeline, specifying the source and destination datasets (directories); see the sketch after this list.
    • This approach provides more flexibility and scalability for complex data movement scenarios.
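
    As a very rough sketch only (it assumes the datafactory Azure CLI extension, an existing data factory, and pre-created source and sink datasets named SourceBlobDataset and SinkBlobDataset; all names are placeholders), a Copy activity pipeline could be defined and triggered from the command line:

    # Create a pipeline containing a single Copy activity that reads from the source dataset and writes to the sink dataset
    az datafactory pipeline create \
        --resource-group <resource-group> \
        --factory-name <data-factory-name> \
        --name CopyDirectoryPipeline \
        --pipeline '{"activities": [{"name": "CopyDirectory", "type": "Copy", "inputs": [{"referenceName": "SourceBlobDataset", "type": "DatasetReference"}], "outputs": [{"referenceName": "SinkBlobDataset", "type": "DatasetReference"}], "typeProperties": {"source": {"type": "BlobSource"}, "sink": {"type": "BlobSink"}}}]}'

    # Trigger a run of the pipeline
    az datafactory pipeline create-run \
        --resource-group <resource-group> \
        --factory-name <data-factory-name> \
        --name CopyDirectoryPipeline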

    Hope this answer helps! Please let us know if you have any further queries. I’m happy to assist you further.


    Please "Accept the answer” and “up-vote” wherever the information provided helps you, this can be beneficial to other community members.


  4. JohnyBenz 316 Reputation points
    2024-03-16T11:23:34.89+00:00

    Try Azure Data Factory (ADF):

    • ADF is a cloud-based ETL (Extract, Transform, Load) service for data integration and automation.
    • You can create an ADF pipeline to copy data between Azure Blob Storage containers.
    • ADF offers a visual interface and allows for scheduling and orchestration of data movement tasks.
    • Here are some resources to get started with ADF copy activity for blobs: https://video2.skills-academy.com/en-us/azure/data-factory/connector-azure-blob-storage

    On the other hand:

    Try copy tools like AvePoint, Gs Richcopy360, Cloudfuze, and Multi-Cloud; all of them can copy between Azure directories easily.

