4 TB File Transfer Timing Out

Gabe Quinn 0 Reputation points
2023-09-26T13:57:25.79+00:00

When receiving a large data transfer (a 4 TB .vhd file), the transfer times out.

Are there any settings on either end of the Azure Storage Accounts that need to be set in order for this to work?

Azure Storage Explorer
An Azure tool that is used to manage cloud storage resources on Windows, macOS, and Linux.

1 answer

  1. Sumarigo-MSFT 44,906 Reputation points Microsoft Employee
    2023-09-28T05:41:38.3133333+00:00

    @Gabe Quinn Welcome to the Microsoft Q&A forum, and thank you for posting your query here!

    When transferring large files to Azure Storage, there are a few settings that you can adjust to optimize the transfer and avoid timeouts. Here are a few things to consider:

    1. Use AzCopy: AzCopy is a command-line tool that you can use to transfer data to and from Azure Storage. It's designed to handle large files and can automatically resume interrupted transfers. You can also configure the number of concurrent connections and the block size used for each transfer.

    Fast Data Transfer is a tool for fast upload of data into Azure – up to 4 terabytes per hour from a single client machine. It moves data from your premises to Blob Storage, to a clustered file system, or direct to an Azure VM. It can also move data between Azure regions.

    2. Increase the timeout value: By default, Azure Storage applies a server-side timeout to each request (30 seconds for many blob operations). If you're transferring a large file, you may need to increase this value to avoid timeouts; the storage client libraries expose this as a server timeout option (for example, the ServerTimeout request option in the .NET client library).
    3. Use a storage account with premium performance: Premium performance storage accounts have higher IOPS and throughput limits, which can improve the speed of a large transfer.
    4. Use a dedicated connection: A dedicated connection to Azure Storage, set up with Azure ExpressRoute or a VPN gateway, can improve transfer speed and reduce the risk of timeouts.
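    As a rough sketch of the AzCopy approach above (the account names, container names, blob names, and `<SAS>` tokens are placeholders you would replace with your own values), a server-to-server copy of a large .vhd between two storage accounts might look like this:

    ```shell
    # Sketch only: substitute your own account/container names and SAS tokens.

    # Raise the number of concurrent network connections AzCopy uses.
    export AZCOPY_CONCURRENCY_VALUE=32

    # Server-to-server copy of a large VHD between two storage accounts.
    # --block-size-mb controls the size of each block AzCopy transfers.
    azcopy copy \
      "https://sourceaccount.blob.core.windows.net/vhds/disk.vhd?<SAS>" \
      "https://destaccount.blob.core.windows.net/vhds/disk.vhd?<SAS>" \
      --block-size-mb 100

    # If the transfer is interrupted, list jobs and resume the failed one.
    azcopy jobs list
    azcopy jobs resume <job-id>
    ```

    Because AzCopy tracks each transfer as a resumable job, a dropped connection partway through a 4 TB copy does not mean starting over.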

    There are several options for transferring data to and from Azure, depending on your needs. https://video2.skills-academy.com/en-us/azure/architecture/data-guide/scenarios/data-transfer

    This article provides an overview of some of the common Azure data transfer solutions. The article also links out to recommended options depending on the network bandwidth in your environment and the size of the data you intend to transfer.

    Data transfer for large datasets with moderate to high network bandwidth


    • If the network transfer is projected to be too slow, you should use a physical device. The recommended options in this case are the offline transfer devices from the Azure Data Box family, or Azure Import/Export using your own disks.
    • Azure Data Box family for offline transfers – Use Microsoft-supplied Data Box devices to move large amounts of data to Azure when you're limited by time, network availability, or costs. Copy on-premises data using tools such as Robocopy. Depending on the data size intended for transfer, you can choose from Data Box Disk, Data Box, or Data Box Heavy.
    • Azure Import/Export – Use the Azure Import/Export service by shipping your own disk drives to securely import large amounts of data to Azure Blob storage and Azure Files. This service can also be used to transfer data from Azure Blob storage to disk drives and ship them to your on-premises sites.
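    To decide between an online transfer and an offline device, a quick back-of-envelope estimate of transfer time helps. A minimal shell sketch, assuming a sustained 1 Gbps (1000 Mbps) link and treating 1 TB as 8 × 1024 × 1024 megabits:

    ```shell
    # Rough estimate of how long 4 TB takes at a given sustained throughput.
    SIZE_TB=4
    MBPS=1000

    # Convert total size to megabits, then divide by throughput in Mbps
    # to get seconds, then convert to hours (integer arithmetic).
    TOTAL_MEGABITS=$(( SIZE_TB * 8 * 1024 * 1024 ))
    SECS_NEEDED=$(( TOTAL_MEGABITS / MBPS ))
    HOURS=$(( SECS_NEEDED / 3600 ))

    echo "About ${HOURS} hours at ${MBPS} Mbps"   # roughly 9 hours at 1 Gbps
    ```

    If the estimate at your actual available bandwidth runs to days or weeks, an offline option such as Data Box is usually the better choice.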

    Please let us know if you have any further queries; I'm happy to assist you.


    Please do not forget to "Accept the answer" and "up-vote" wherever the information provided helps you; this can be beneficial to other community members.
