Azure Data Lake Gen2 PUT authorization

Magnus Vhendin
2020-09-25T07:22:12.537+00:00

I'm trying to create a Shared Access Signature client side in my Node app. The reason is that I don't want to stream files through my app: I want the user to upload a file directly to my Azure Data Lake Gen2 blob storage container.

I have looked at all the examples I can find, but they are all server side. So I tried generating SAS query parameters with generateDataLakeSASQueryParameters (from the JavaScript SDK) and using them in the PUT request. The process looks like it works, and I return the result to the client.

Server side:

async getFileUploadUrl(path) {
    const now = new Date().toUTCString();
    const startsOn = new Date(now);
    startsOn.setMinutes(startsOn.getMinutes() - 10); // Skip clock skew with server

    const expiresOn = new Date(now);
    expiresOn.setHours(expiresOn.getHours() + 1); // Expires in one hour

    const sharedKeyCredential = new StorageSharedKeyCredential(this.storageAccountName, this.accountKey);

    const sas = generateDataLakeSASQueryParameters({
        fileSystemName: this.fileSystemClient.name,
        ipRange: { start: "0.0.0.0", end: "255.255.255.255" },
        expiresOn,
        protocol: SASProtocol.HttpsAndHttp,
        permissions: DataLakeSASPermissions.parse("c").toString(), //  Read (r), Write (w), Delete (d), List (l), Add (a), Create (c), Update (u), Process (p)
        resourceTypes: AccountSASResourceTypes.parse("o").toString(), //  Service (s), Container (c), Object (o)
        services: AccountSASServices.parse("b").toString(), //  Blob (b), Table (t), Queue (q), File (f)
        startsOn,
        version: "2019-12-12"
    },
    sharedKeyCredential);

    const encodedURI = encodeURI(path);
    const filePath = `${this.fileSystemClient.url}/${encodedURI}`;

    return {
        url: filePath,
        signature: sas.signature,
    };
}

Client side:

const { url, signature } = serverResponse;

const file = [file taken from an input tag];

const request = new XMLHttpRequest();
request.open('PUT', url);
request.setRequestHeader("x-ms-date", new Date().toUTCString());
request.setRequestHeader("x-ms-version", '2019-12-12');
request.setRequestHeader("x-ms-blob-type", 'BlockBlob');
request.setRequestHeader("Authorization", `SharedKey [storageaccount]:${signature}`);

request.send(file);

And what I keep getting back is a 403 with the following error:

The MAC signature found in the HTTP request '[signature]' is not the
same as any computed signature. Server used following string to sign:
'PUT\n\n\n1762213\n\nimage/png\n\n\n\n\n\n\nx-ms-date:Thu, 24 Sep 2020
12:24:05 GMT\nx-ms-version:2019-12-12\n/[account name]/[container
name]/[folder name]/image.png'.

Obviously I removed the actual signature (I have gotten it to work server side), but it looks something like this: hGhg765+NIGjhgluhuUYG686dnH90HKYFytf6= (I made this up, but it is in the right format).

I have also tried returning the parsed query string and using it in a PUT request, but then I get errors stating a required header is missing, and I cannot figure out which one it should be. No Authorization header should be required in that case, for instance.
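For reference, the query-string variant I tried looks roughly like this on the client (buildSasUrl is just an illustrative helper name; the token is whatever sas.toString() produced server side):

```javascript
// Illustrative helper: append the SAS token, as produced by sas.toString(),
// to the file URL instead of sending an Authorization header.
function buildSasUrl(containerUrl, path, sasToken) {
  return `${containerUrl}/${encodeURI(path)}?${sasToken}`;
}

const url = buildSasUrl(
  'https://myaccount.dfs.core.windows.net/mycontainer', // example account/container
  'folder name/image.png',
  'sv=2019-12-12&sp=c&sig=...' // placeholder token
);
console.log(url);
// https://myaccount.dfs.core.windows.net/mycontainer/folder%20name/image.png?sv=2019-12-12&sp=c&sig=...
```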

Azure Data Lake Storage

1 answer

  1. Magnus Vhendin
    2020-09-28T06:59:30.207+00:00

    So I'm attempting to create a SAS URL that I can do a PUT against. New and updated, this is what I do server side:

    async getFileUploadUrl(path) {
        const now = new Date().toUTCString();
        const startsOn = new Date(now);
        startsOn.setMinutes(startsOn.getMinutes() - 10); // Skip clock skew with server

        const expiresOn = new Date(now);
        expiresOn.setHours(expiresOn.getHours() + 1); // Expires in one hour

        const encodedURI = encodeURI(path);

        const sas = generateDataLakeSASQueryParameters({
            fileSystemName: this.fileSystemClient.name, // Created from the container I want to work in
            pathName: path, // File name including its path (pathName is the property name the SDK expects)
            permissions: DataLakeSASPermissions.parse("c"), // c as in Create
            startsOn,
            expiresOn,
            ipRange: { start: "0.0.0.0", end: "255.255.255.255" }, // All for dev purposes
            protocol: SASProtocol.HttpsAndHttp, // Both for now (dev purposes)
        },
        new StorageSharedKeyCredential(this.storageAccountName, this.accountKey));

        const sasUrl = `${this.fileSystemClient.url}/${encodedURI}?${sas.toString()}`;

        return {
            sasUrl,
        };
    }


    And then client side:

    const { sasUrl } = response;
    const request = new XMLHttpRequest();
    request.open('PUT', sasUrl);
    request.setRequestHeader("x-ms-date", new Date().toUTCString());
    request.setRequestHeader("x-ms-blob-type", 'BlockBlob');
    
    request.send(file);
    

    All I get in response is 400: An HTTP header that's mandatory for this request is not specified.
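
    One thing I still want to rule out: if I understand the REST docs, the single-request PUT with x-ms-blob-type: BlockBlob is a Blob service operation, while the Data Lake (dfs) endpoint expects a PUT with ?resource=file followed by separate append/flush calls. Since Gen2 accounts expose both endpoints, the next thing I plan to try is pointing the same SAS URL at the blob endpoint (untested assumption on my part that the SAS carries over):

    ```javascript
    // Sketch (untested assumption): "Put Blob" with the x-ms-blob-type header is
    // a blob-endpoint operation, so swap the .dfs. host for .blob. before the PUT.
    function toBlobEndpoint(sasUrl) {
      return sasUrl.replace('.dfs.core.windows.net', '.blob.core.windows.net');
    }

    const blobUrl = toBlobEndpoint(
      'https://myaccount.dfs.core.windows.net/mycontainer/image.png?sv=2019-12-12&sig=...'
    );
    console.log(blobUrl);
    // https://myaccount.blob.core.windows.net/mycontainer/image.png?sv=2019-12-12&sig=...
    ```

    The alternative would be staying on the dfs endpoint and doing the create/append/flush sequence, but that is more than one request per upload.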