How to upload a large VHDX file with AzCopy

James Vega 41 Reputation points
2024-08-06T21:07:04.63+00:00

I'm trying to upload a 100+ GB VHDX file to an Azure Storage container through the Azure Portal, but I always get an Out of Memory error in Edge, even though I disabled hardware acceleration and tried other settings.

I'm now trying to upload it with AzCopy, using a SAS token to authorize the transfer, but the process always fails.

The script I'm using to upload the image is:

$StorageAccountName = "athenaosimages"
$ContainerName = "athenaarch"
$LocalPath = "C:\ProgramData\Microsoft\Windows\Virtual Hard Disks\AthenaArch.vhdx"
$SAS = "sp=....."
$URI = "https://$StorageAccountName.blob.core.windows.net/$ContainerName?$SAS"
.\azcopy.exe copy "$LocalPath" "$URI" --recursive

but I get:

INFO: Scanning...
INFO: Autologin not specified.
INFO: Authenticating to destination using Unknown, Please authenticate using Microsoft Entra ID (https://aka.ms/AzCopy/AuthZ), use AzCopy login, or append a SAS token to your Azure URL.
INFO: Any empty folders will not be processed, because source and/or destination doesn't have full folder support

Job 5383dff6-14fd-e242-6d90-00c70398041f has started
Log file is located at: C:\Users\<user>\.azcopy\5383dff6-14fd-e242-6d90-00c70398041f.log

0.0 %, 0 Done, 1 Failed, 0 Pending, 0 Skipped, 1 Total,

and the log file:

2024/08/06 20:56:30 AzcopyVersion  10.26.0
2024/08/06 20:56:30 OS-Environment  windows
2024/08/06 20:56:30 OS-Architecture  amd64
2024/08/06 20:56:30 Log times are in UTC. Local time is 6 Aug 2024 22:56:30
2024/08/06 20:56:31 ISO 8601 START TIME: to copy files that changed before or after this job started, use the parameter --include-before=2024-08-06T20:56:25Z or --include-after=2024-08-06T20:56:25Z
2024/08/06 20:56:31 Authenticating to destination using Unknown, Please authenticate using Microsoft Entra ID (https://aka.ms/AzCopy/AuthZ), use AzCopy login, or append a SAS token to your Azure URL.
2024/08/06 20:56:31 Any empty folders will not be processed, because source and/or destination doesn't have full folder support
2024/08/06 20:56:31 Job-Command copy C:\ProgramData\Microsoft\Windows\Virtual Hard Disks\AthenaArch.vhdx https://athenaosimages.blob.core.windows.net/sp=racwdli&st=2024-08-06T20:24:36Z&se=2024-08-07T04:24:36Z&spr=https&sv=2022-11-02&sr=c&sig=-REDACTED- --recursive
2024/08/06 20:56:31 Number of CPUs: 12
2024/08/06 20:56:31 Max file buffer RAM 6.000 GB
2024/08/06 20:56:31 Max concurrent network operations: 192 (Based on number of CPUs. Set AZCOPY_CONCURRENCY_VALUE environment variable to override)
2024/08/06 20:56:31 Check CPU usage when dynamically tuning concurrency: true (Based on hard-coded default. Set AZCOPY_TUNE_TO_CPU environment variable to true or false override)
2024/08/06 20:56:31 Max concurrent transfer initiation routines: 64 (Based on hard-coded default. Set AZCOPY_CONCURRENT_FILES environment variable to override)
2024/08/06 20:56:31 Max enumeration routines: 16 (Based on hard-coded default. Set AZCOPY_CONCURRENT_SCAN environment variable to override)
2024/08/06 20:56:31 Parallelize getting file properties (file.Stat): false (Based on AZCOPY_PARALLEL_STAT_FILES environment variable)
2024/08/06 20:56:31 Max open files when downloading: 2147483048 (auto-computed)
2024/08/06 20:56:31 Final job part has been created
2024/08/06 20:56:31 Final job part has been scheduled
2024/08/06 20:56:31 INFO: [P#0-T#0] Starting transfer: Source "\\\\?\\C:\\ProgramData\\Microsoft\\Windows\\Virtual Hard Disks\\AthenaArch.vhdx" Destination "https://athenaosimages.blob.core.windows.net/sp=racwdli&st=2024-08-06T20:24:36Z&se=2024-08-07T04:24:36Z&spr=https&sv=2022-11-02&sr=c&sig=-REDACTED-" Specified chunk size 4194304
2024/08/06 20:56:31 ==> REQUEST/RESPONSE (Try=1/17.516ms, OpTime=114.7552ms) -- RESPONSE STATUS CODE ERROR
   PUT https://athenaosimages.blob.core.windows.net/sp=racwdli&st=2024-08-06T20:24:36Z&se=2024-08-07T04:24:36Z&spr=https&sv=2022-11-02&sr=c&sig=-REDACTED-
   Accept: application/xml
   Content-Length: 0
   User-Agent: AzCopy/10.26.0 azsdk-go-azblob/v1.4.0 (go1.22.5; Windows_NT)
   X-Ms-Client-Request-Id: 515fc517-b48e-43a9-5f48-7793cc386973
   x-ms-blob-content-length: 136369405952
   x-ms-blob-content-type: application/octet-stream
   x-ms-blob-sequence-number: 0
   x-ms-blob-type: PageBlob
   x-ms-version: 2023-08-03
   --------------------------------------------------------------------------------
   RESPONSE Status: 401 Server failed to authenticate the request. Please refer to the information in the www-authenticate header.
   Content-Length: 302
   Content-Type: application/xml
   Date: Tue, 06 Aug 2024 20:56:31 GMT
   Server: Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0
   Www-Authenticate: Bearer authorization_uri=https://login.microsoftonline.com/961f255c-920f-4f73-87c0-625de0dbf74c/oauth2/authorize resource_id=https://storage.azure.com
   X-Ms-Client-Request-Id: 515fc517-b48e-43a9-5f48-7793cc386973
   X-Ms-Error-Code: NoAuthenticationInformation
   X-Ms-Request-Id: 21894279-501e-0008-3a43-e8bd9e000000
   X-Ms-Version: 2023-08-03
Response Details: <Code>NoAuthenticationInformation</Code><Message>Server failed to authenticate the request. Please refer to the information in the www-authenticate header. </Message>

2024/08/06 20:56:31 ERR: [P#0-T#0] UPLOADFAILED: \\?\C:\ProgramData\Microsoft\Windows\Virtual Hard Disks\AthenaArch.vhdx : 401 : 401 Server failed to authenticate the request. Please refer to the information in the www-authenticate header.. When Creating blob. X-Ms-Request-Id: 21894279-501e-0008-3a43-e8bd9e000000

   Dst: https://athenaosimages.blob.core.windows.net/sp=racwdli&st=2024-08-06T20:24:36Z&se=2024-08-07T04:24:36Z&spr=https&sv=2022-11-02&sr=c&sig=-REDACTED-
2024/08/06 20:56:31 JobID=5383dff6-14fd-e242-6d90-00c70398041f, Part#=0, TransfersDone=1 of 1
2024/08/06 20:56:31 all parts of entire Job 5383dff6-14fd-e242-6d90-00c70398041f successfully completed, cancelled or paused
2024/08/06 20:56:31 is part of Job which 1 total number of parts done
2024/08/06 20:56:33 PERF: primary performance constraint is Unknown. States: X:  0, O:  0, M:  0, L:  0, R:  0, D:  0, W:  0, F:  0, B:  0, E:  0, T:  0, GRs: 192
2024/08/06 20:56:33 0.0 %, 0 Done, 1 Failed, 0 Pending, 0 Skipped, 1 Total,
2024/08/06 20:56:33

I also granted all permissions on the SAS token (I created it as "Account key" under Container -> Shared Access Tokens).

I also tried to set the container access level to "Anonymous read access for containers and blobs":

PS C:\> Get-AzStorageContainer -Context $ctx | Select Name, PublicAccess

Name       PublicAccess
----       ------------
athenaarch    Container

but I still get the error.


Accepted answer
  1. Nehruji R 7,306 Reputation points Microsoft Vendor
    2024-08-14T05:00:33.1966667+00:00

    Hello James Vega, I'm glad that you were able to resolve your issue, and thank you for posting your solution so that others experiencing the same thing can easily reference it!

    Since the Microsoft Q&A community has a policy that "The question author cannot accept their own answer. They can only accept answers by others", I'll repost your solution in case you'd like to "Accept" the answer. Accepted answers show up at the top, resulting in improved discoverability for others.

    Issue: Customer trying to upload a 100+ GB VHDX file to a container in Azure Storage and encountering an authentication failure.

    Error Message:

    UPLOADFAILED: \\?\C:\ProgramData\Microsoft\Windows\Virtual Hard Disks\AthenaArch.vhdx : 401 : 401 Server failed to authenticate the request. Please refer to the information in the www-authenticate header.. When Creating blob. X-Ms-Request-Id: 21894279-501e-0008-3a43-e8bd9e000000


    Cause: The destination URL in the script was wrong: the /? between the container name and the SAS token was missing in $URI, so the SAS never reached the query string. (A likely reason: PowerShell allows ? in variable names, so "$ContainerName?$SAS" expands the undefined variable $ContainerName? and drops both the container name and the separator, which matches the destination URL in the log.)

    Solution: After modifying the script as below, the issue was resolved.

    $StorageAccountName = "athenaosimages"
    $ContainerName = "athenaarch"
    $LocalPath = "C:\ProgramData\Microsoft\Windows\Virtual Hard Disks\athenaarch.vhdx"
    $SAS = "sp=...%3D"
    $URI = "https://$StorageAccountName.blob.core.windows.net/$ContainerName/?$SAS"
    .\azcopy.exe copy "$LocalPath" "$URI" --recursive=true
    
    
    

    reference docs: https://video2.skills-academy.com/en-us/azure/storage/common/storage-use-azcopy-v10
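    To make the URL shape concrete, here is a minimal bash sketch (not PowerShell; the account and container names are the ones from this thread, and the SAS value is a redacted placeholder, not a working token) of how the corrected destination URL is assembled:

```shell
# Assemble the corrected AzCopy destination URL.
STORAGE_ACCOUNT="athenaosimages"
CONTAINER="athenaarch"
SAS="sp=racwdli&sig=REDACTED"   # redacted placeholder

# The "/?" between the container name and the SAS is the critical part:
# without it, the token is fused onto the path, the service sees no query
# string, and it answers 401 NoAuthenticationInformation, as in the log.
URI="https://${STORAGE_ACCOUNT}.blob.core.windows.net/${CONTAINER}/?${SAS}"
echo "$URI"
```

    This prints https://athenaosimages.blob.core.windows.net/athenaarch/?sp=racwdli&sig=REDACTED, i.e. the SAS parameters sit after the ?, where the storage service can read them.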

    1 person found this answer helpful.

3 additional answers

Sort by: Most helpful
  1. Vlad Costa 1,480 Reputation points
    2024-08-07T09:20:30.4866667+00:00

    Hi @James Vega

    Have you tried uploading the VHDX file using Azure Storage Explorer? You can download it using the link below:

    https://azure.microsoft.com/en-us/products/storage/storage-explorer


    If this answers your question, please click Accept Answer and Yes if this answer was helpful. Doing so would help other community members with similar issues identify the solution. I highly appreciate your contribution to the community.


  2. Nehruji R 7,306 Reputation points Microsoft Vendor
    2024-08-08T07:30:07.3533333+00:00

    Hello James Vega,

    Greetings! Welcome to Microsoft Q&A Platform.

    I understand that you are trying to upload a 100+ GB VHDX file to a container in Azure Storage and are encountering an authentication failure.

    Please try the following troubleshooting steps to resolve the issue.

    This tutorial shows you how to deploy an application that uploads large amounts of random data to an Azure storage account: Upload large amounts of random data in parallel to Azure storage

    I would recommend AzCopy as the best tool for transferring the data. It reports accurate values for job status in the jobs commands (e.g. completed with failed or skipped transfers), and it supports uploading large files of up to 4 TiB. Please refer to this article for more information.

    If the transfer job error does not result from the SAS token or authentication, you could try the commands below from this link.

    To show the error messages of the failed job, run azcopy jobs show <job-id> --with-status=Failed, and then execute the resume command if a large number of transfers failed. A simple solution for the scoped scenario of populating a blank directory is to run the azcopy copy command again with the --overwrite=false option.
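    Spelled out, the jobs commands look like this (a sketch: the job ID is the one from the question's output, and <SAS> is a placeholder for a valid, unexpired token):

```shell
# Show only the failed transfers of the job, with their error messages.
azcopy jobs show 5383dff6-14fd-e242-6d90-00c70398041f --with-status=Failed

# Retry the failed and unstarted transfers of the same job;
# a valid destination SAS must be supplied again.
azcopy jobs resume 5383dff6-14fd-e242-6d90-00c70398041f --destination-sas="<SAS>"
```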

    This article provides an overview of some of the common Azure data transfer solutions and links out to recommended options depending on the network bandwidth in your environment and the size of the data you intend to transfer. Please see here.

    Hope this answer helps! please let us know if you have any further queries. I’m happy to assist you further.

    Please "Accept the answer" and "up-vote" wherever the information provided helps you; this can be beneficial to other community members.


  3. James Vega 41 Reputation points
    2024-08-11T18:52:32.5+00:00

    I found the error. The issue was in my script: I typed the wrong destination URL, missing the /? in $URI. The correct script is:

    $StorageAccountName = "athenaosimages"
    $ContainerName = "athenaarch"
    $LocalPath = "C:\ProgramData\Microsoft\Windows\Virtual Hard Disks\AthenaArch.vhdx"
    $SAS = "sp=...%3D"
    $URI = "https://$StorageAccountName.blob.core.windows.net/$ContainerName/?$SAS"
    .\azcopy.exe copy "$LocalPath" "$URI" --recursive=true
    

    Useful docs: https://video2.skills-academy.com/en-us/azure/storage/common/storage-use-azcopy-v10

    After fixing this, I got the following error:

    409 the blob type is invalid for this operation
    

    To fix it, I renamed the source file AthenaArch.vhdx to all lowercase letters, and then it worked.
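    For context, one possible explanation (my assumption; the thread doesn't confirm it): AzCopy uploads .vhd/.vhdx files as page blobs by default, and a 409 "blob type is invalid for this operation" typically means a blob with the same name already exists as a different type, for instance a block blob left behind by the earlier portal attempts. Renaming the source changed the destination blob name, which side-steps the clash; deleting the stale blob, or pinning the type explicitly, would be alternatives:

```powershell
# Pin the destination blob type explicitly
# (PageBlob is what VHD/VHDX images used as disks require).
.\azcopy.exe copy "$LocalPath" "$URI" --blob-type PageBlob
```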

    So the final working PowerShell script is:

    $StorageAccountName = "athenaosimages"
    $ContainerName = "athenaarch"
    $LocalPath = "C:\ProgramData\Microsoft\Windows\Virtual Hard Disks\athenaarch.vhdx"
    $SAS = "sp=...%3D"
    $URI = "https://$StorageAccountName.blob.core.windows.net/$ContainerName/?$SAS"
    .\azcopy.exe copy "$LocalPath" "$URI" --recursive=true
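    As an optional sanity check (a sketch reusing the same variables; the SAS must still be valid, and the sp=racwdli permissions already include list), the container can be listed with the same token to confirm the blob arrived:

```powershell
# List the container contents to verify the uploaded blob.
.\azcopy.exe list "https://$StorageAccountName.blob.core.windows.net/$ContainerName/?$SAS"
```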
    
