Move data from one container to another in Databricks

Shambhu Rai 1,406 Reputation points
2023-12-11T23:23:22.1233333+00:00

Hi Experts,

How can we move files from one container to another based on the last-modified timestamp?

e.g., from /mnt/test/test1 to /mnt/test/test2


1 answer

  1. Vinodh247-1375 12,506 Reputation points
    2023-12-12T06:43:44.88+00:00

    Hi Shambhu Rai,

    Thanks for reaching out to Microsoft Q&A.

    Yes, you can do this using Azure Data Factory (ADF) pipelines.

    You can use the 'Filter by last modified' setting on the copy activity's source to pick files whose last-modified time falls between a start and end time (UTC). I have blogged about how to use this filter by adding/subtracting from the current datetime to select and copy files:

    https://vinsdata.wordpress.com/2021/10/25/incremental-file-copy-in-azure-data-factory/
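
    For illustration, here is a minimal Python sketch of that windowing logic. The 24-hour window is an assumption for the example; the resulting timestamps are the kind of values you would supply to the copy source's modifiedDatetimeStart/modifiedDatetimeEnd settings (in ADF itself you would typically build them with expressions such as addHours(utcNow(), -24)).

    ```python
    from datetime import datetime, timedelta, timezone

    # Build a UTC window covering "files modified in the last 24 hours".
    # The window size is an assumption for this example.
    end_utc = datetime.now(timezone.utc)
    start_utc = end_utc - timedelta(hours=24)

    # ISO-8601 strings of the kind the 'Filter by last modified'
    # start/end fields expect.
    print(start_utc.strftime("%Y-%m-%dT%H:%M:%SZ"))
    print(end_utc.strftime("%Y-%m-%dT%H:%M:%SZ"))
    ```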

    Note:

    1. The use case I blogged about is incremental file copy, but you can adapt the idea to your scenario with only minor changes.
    2. An ADF pipeline has no Move activity, but you can achieve a move with a Copy activity followed by a Delete activity; see the Databricks sketch after this list for the same pattern.
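
    Since your question mentions Databricks mount points, the same move can also be done entirely from a Databricks notebook. A minimal sketch, assuming the containers are mounted at /mnt/test/test1 and /mnt/test/test2, a 24-hour window, and a Databricks Runtime recent enough for dbutils.fs.ls to return a modificationTime field:

    ```python
    from datetime import datetime, timedelta, timezone

    # Assumed mount points from the question; adjust to your paths.
    src = "dbfs:/mnt/test/test1"
    dst = "dbfs:/mnt/test/test2"

    # Only move files modified within the last 24 hours (assumed window).
    cutoff_ms = int((datetime.now(timezone.utc) - timedelta(hours=24)).timestamp() * 1000)

    # dbutils is predefined in Databricks notebooks; FileInfo.modificationTime
    # is milliseconds since the Unix epoch.
    for f in dbutils.fs.ls(src):
        if not f.isDir() and f.modificationTime >= cutoff_ms:
            # dbutils.fs.mv moves the file; alternatively, dbutils.fs.cp
            # followed by dbutils.fs.rm mirrors the copy-then-delete
            # pattern described in note 2.
            dbutils.fs.mv(f.path, dst + "/" + f.name)
    ```

    Test this on a small folder first, since the delete side of a move is not reversible.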

    Let me know if you have any questions.

    Please 'Upvote' (thumbs-up) and 'Accept' the answer if the reply was helpful. This will benefit other community members who face the same issue.