@Johny Yagappan Welcome to the Microsoft Q&A platform, and thanks for posting your question.
I'm glad you were able to resolve your issue, and thank you for posting your solution so that others experiencing the same problem can easily reference it! Since the Microsoft Q&A community has a policy that "The question author cannot accept their own answer. They can only accept answers by others," I'll repost your solution in case you'd like to accept the answer.
Ask: I have a Spark job definition in a Python file, and I pass the blob fileName and folderPath as parameters using Dynamic Content and the pipeline expression builder. I have defined the parameters as follows:
fileName default value to @triggerBody().fileName
folderPath default value to @triggerBody().folderPath
In the Spark job definition, I use Dynamic Content and the pipeline expression builder to pass fileName and folderPath as command-line arguments:
@array(concat(pipeline().parameters.fileName,' ',pipeline().parameters.folderPath))
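(Editor's note: `concat()` joins the two parameter values into a single string before `array()` wraps it, so the job receives both values as one command-line argument. As a hedged alternative, assuming the pipeline's expression language supports the `createArray()` function, passing each parameter as its own array element would deliver them as two separate arguments:)

```
@createArray(pipeline().parameters.fileName, pipeline().parameters.folderPath)
```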
When I trigger the pipeline with the trigger file 20230404.trg, the pipeline executes, but the blob fileName and folderPath are not passed to the Spark job. I print the arguments as follows:
arguments = sys.argv[1:] # Get arguments passed to the Spark job
print("SYS ARGUMENTS::", str(arguments))
The sys.argv[1:] output is as follows:
SYS ARGUMENTS:: ['@triggerBody().fileName @triggerBody().folderPath']
Why am I not getting the actual fileName and folderPath? I expect fileName to be 20230404.trg and folderPath to be abfss://xxxxxxxxx@xxxxxxxxx.dfs.core.windows.net/AI.
Your response is greatly appreciated.
Solution: I found the issue and fixed it. The problem was in the trigger definition, where these parameters had no value set. Once I set the default values (fileName to @triggerBody().fileName and folderPath to @triggerBody().folderPath), I was able to access the fileName and folderPath values in the PySpark script.
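For reference, a minimal sketch of reading the values inside the PySpark script, assuming the pipeline still passes them as one space-joined argument per the array(concat(...)) expression above (the helper name parse_args is illustrative, not from the original post):

```python
import sys

def parse_args(argv):
    # The expression array(concat(fileName, ' ', folderPath)) delivers
    # both values as ONE space-joined argument, so split it back into
    # its two parts. maxsplit=1 keeps any later spaces in the folder
    # path intact.
    file_name, folder_path = argv[0].split(" ", 1)
    return file_name, folder_path

if __name__ == "__main__":
    # sys.argv[1:] holds the arguments passed by the Spark job definition
    file_name, folder_path = parse_args(sys.argv[1:])
    print("fileName:", file_name)
    print("folderPath:", folder_path)
```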
If I missed anything, please let me know and I'd be happy to add it to my answer, or feel free to comment below with any additional information.
If you have any other questions, please let me know. Thank you again for your time and patience throughout this issue.
Please don't forget to Accept Answer and select Yes for "Was this answer helpful?" wherever the information provided helps you, as this can benefit other community members.