Multi-tenant and Event Grid trigger

jigsm 236 Reputation points
2020-11-03T05:43:22.007+00:00

Respected,

We have our website which supports multiple tenants.

We are designing an Azure Data Factory setup in which each tenant uploads a file to Azure Blob Storage that needs to be processed. Once the file is uploaded, the trigger configured on the pipeline should activate the pipeline.

We have organized our blob structure as follows:

import\tenant1\incoming\import.csv

import\tenant2\incoming\import.csv

import\tenant3\incoming\import.csv

import is the name of the container, and we have created subfolders based on the tenant names.
Files will be dropped in the incoming folder of each tenant.

Now, my question is how to configure an event-based trigger that will activate when a blob is added to the incoming folder of any tenant.

Any pointers will be very helpful.

Regards

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. Saurabh Sharma 23,801 Reputation points Microsoft Employee
    2020-11-04T22:24:10.2+00:00

    @jigsm You can try the below approach and see if it helps:
    While creating the event trigger, leave the "Blob path begins with" property blank and set "Blob path ends with" to ".csv". This way, all incoming .csv blobs in the container are considered, whichever of the underlying folders (tenant1, tenant2, etc.) they reside in.
    37505-image.png
    Also, when you click Continue, you can see the blobs being detected under any of these folders.
    37562-image.png

    The event trigger captures the folder path and file name of the blob in the properties @triggerBody().folderPath and @triggerBody().fileName. To use these values in your pipeline, you need to map these properties to pipeline parameters. You can then access the values captured by the trigger through @pipeline().parameters.parameterName in your pipeline.
    37466-image.png
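    For reference, a trigger definition along these lines could look roughly like the JSON sketch below. The trigger name, pipeline name, and parameter names are hypothetical, and the storage account scope is shown with placeholders you would replace with your own resource IDs:

```json
{
  "name": "ImportCsvTrigger",
  "properties": {
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "blobPathEndsWith": ".csv",
      "events": [ "Microsoft.Storage.BlobCreated" ],
      "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "ProcessImportPipeline",
          "type": "PipelineReference"
        },
        "parameters": {
          "folderPath": "@triggerBody().folderPath",
          "fileName": "@triggerBody().fileName"
        }
      }
    ]
  }
}
```

    Note that "Blob path begins with" is simply omitted here (blobPathBeginsWith is left out), which matches leaving the property blank in the UI.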

    So, your pipeline will be triggered when a new file is uploaded in your blob container in either tenant1 or tenant2 folder path. (see trigger runs screenshot below)
    37601-image.png

    The only problem with this approach is that the pipeline will also be triggered if a blob is uploaded to the container outside of the folder structure (tenant1, tenant2, etc.). If your .csv files are always going to arrive under these folders, you won't have any issues; otherwise, you may need to add logic in your pipeline so that it only acts on files that arrive under your tenant-specific folder paths.
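    If you do need that guard, the check amounts to validating that the captured folder path matches the import/&lt;tenant&gt;/incoming shape. Here is a minimal Python sketch of that logic (the function name and regex are my own illustration, not ADF code; in ADF you would express the equivalent with an If Condition expression):

```python
import re

# A blob dropped at import/tenant1/incoming/import.csv arrives with
# @triggerBody().folderPath = "import/tenant1/incoming".
# Accept only paths of exactly that shape: container, one tenant
# segment, then "incoming".
TENANT_INCOMING = re.compile(r"import/[^/]+/incoming")

def is_tenant_incoming(folder_path: str) -> bool:
    """Return True when the blob landed in a tenant's incoming folder."""
    return TENANT_INCOMING.fullmatch(folder_path) is not None
```

    In the pipeline itself, the same guard could be an If Condition expression such as @endswith(pipeline().parameters.folderPath, '/incoming'), with the processing activities inside the True branch.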

    ----------

    Please do not forget to "Accept the answer" if the information provided helps you, so it can help others in the community.

