I've solved the issue.
TL;DR: Store the necessary files as blobs in an Azure storage account container, and access them from there with a Shared Access Signature (SAS).
- Create a storage account in Azure.
- Create a storage container in the newly created storage account.
- The storage container holds all the files (as blobs) that your script needs.
- Note: It is advisable to set the access tier for blobs stored in the container to 'Hot' so that files are retrieved from the container as quickly as possible.
- If, like me, you need the storage container's access level to be private, you'll need a shared access signature (SAS) to access the files, rather than just hitting the URL for the container/file directly.
- Note: The SAS should be for the entire container, rather than for each individual file.
- SASs don't have an infinite lifespan, so rather than manually creating new SASs and updating the reference in your script, it is advisable to automate a process that creates a new SAS on a schedule (e.g., every day at a certain time) and stores the value in a key vault.
- This can be accomplished with a function app.
- The function app should:
  - Create a new SAS (can be accomplished with the Azure.Identity and Azure.Storage.Blobs NuGet packages).
  - Delete the current SAS secret value in Key Vault, if it exists.
  - Purge the deleted SAS secret value in Key Vault.
  - Set the new SAS secret value in Key Vault.
- For accessing Key Vault from code, you will need the Azure.Identity and Azure.Security.KeyVault.Secrets NuGet packages.
- From there, you can grant your Azure Load Testing resource access to the key vault, and then reference the secret in your script via its key vault secret URI.
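The function app steps above can be sketched roughly as follows. This is a minimal sketch, not production code: the account name, container name, vault URI, and secret name are placeholders you'd replace with your own, and it assumes the function's managed identity has the required storage and Key Vault role assignments.

```csharp
using System;
using System.Threading.Tasks;
using Azure;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

public static class SasRotation
{
    public static async Task RotateAsync()
    {
        // Placeholder resource names -- substitute your own.
        const string accountName = "mystorageaccount";
        const string containerName = "mycontainer";
        const string vaultUri = "https://my-key-vault.vault.azure.net/";
        const string secretName = "BlobSasToken";

        var credential = new DefaultAzureCredential();

        // 1. Create a new container-level SAS. A ~25-hour lifetime gives a
        //    daily rotation schedule an hour of overlap.
        var blobService = new BlobServiceClient(
            new Uri($"https://{accountName}.blob.core.windows.net"), credential);
        var delegationKey = await blobService.GetUserDelegationKeyAsync(
            DateTimeOffset.UtcNow, DateTimeOffset.UtcNow.AddHours(25));

        var sasBuilder = new BlobSasBuilder
        {
            BlobContainerName = containerName,
            Resource = "c", // "c" = SAS for the whole container
            ExpiresOn = DateTimeOffset.UtcNow.AddHours(25)
        };
        sasBuilder.SetPermissions(BlobContainerSasPermissions.Read);
        string sasToken = sasBuilder
            .ToSasQueryParameters(delegationKey.Value, accountName)
            .ToString();

        // 2-3. Delete the current secret if it exists, then purge it so the
        //      name can be reused immediately.
        var secrets = new SecretClient(new Uri(vaultUri), credential);
        try
        {
            DeleteSecretOperation delete =
                await secrets.StartDeleteSecretAsync(secretName);
            await delete.WaitForCompletionAsync();
            await secrets.PurgeDeletedSecretAsync(secretName);
        }
        catch (RequestFailedException ex) when (ex.Status == 404)
        {
            // No existing secret -- nothing to delete on the first run.
        }

        // 4. Set the new SAS value.
        await secrets.SetSecretAsync(secretName, sasToken);
    }
}
```

Using a user delegation key (via the managed identity) avoids handling the storage account key in code; if you prefer account-key signing, `ToSasQueryParameters` also accepts a `StorageSharedKeyCredential`.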
Here is an example of how to access the key vault from JMeter: create a 'User Defined Variables' config element, and use `${__GetSecret(secretUri)}` to retrieve the most current SAS value.
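Assuming the 'User Defined Variables' element stores the result in a variable named `sasToken` (the variable, account, container, and file names below are placeholders), the SAS can then be appended as the query string of the blob URL in an HTTP Request sampler:

```
User Defined Variables:
  sasToken = ${__GetSecret(secretUri)}

HTTP Request sampler URL:
  https://mystorageaccount.blob.core.windows.net/mycontainer/myfile.csv?${sasToken}
```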