How can I make an automated Python script upload only new data from the Microsoft Graph API to a Microsoft Fabric Lakehouse?
Hello everyone,
I'm looking for help with a Python script I have in a notebook within Microsoft Fabric. The script runs automatically and transfers data from the Microsoft Graph API to the Lakehouse.
At the moment, the script overwrites the entire CSV file on every run, using the following snippet:
if "access_token" in result:
    # Fetch all groups from the Microsoft Graph API as a DataFrame
    groups = getMicrosoftGraphApiGroups()
    # Write the full dataset to the Lakehouse, replacing the existing file
    groups.to_csv("abfss://...my route.../groups.csv", sep="#")
However, I'd like to change this behavior. Instead of rewriting everything, the script should compare the incoming data against what is already stored and upload only the new records (or only that day's records), so the dataset is refreshed daily with just the new information.
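Roughly, what I'm imagining is something like the sketch below, which merges new rows into the existing data using pandas. I'm assuming here that each group record has a stable unique key column (I've used "id", but I'd adjust it to whatever the Graph API actually returns), and the helper function name is just illustrative:

```python
import pandas as pd

def merge_new_rows(existing: pd.DataFrame, latest: pd.DataFrame,
                   key: str = "id") -> pd.DataFrame:
    """Return the existing rows plus only those rows of `latest`
    whose key value is not already present in `existing`."""
    # Keep only rows whose key does not appear in the stored data
    new_rows = latest[~latest[key].isin(existing[key])]
    # Append the genuinely new rows to the existing data
    return pd.concat([existing, new_rows], ignore_index=True)

# Toy example: one overlapping id (2) and one new id (3)
existing = pd.DataFrame({"id": [1, 2], "name": ["a", "b"]})
latest = pd.DataFrame({"id": [2, 3], "name": ["b", "c"]})
merged = merge_new_rows(existing, latest)
# merged contains ids 1, 2, 3 — only the row with id 3 was appended
```

In my case, `existing` would be read back from the CSV already in the Lakehouse and `latest` would come from the Graph API call, with the merged result written back out. But I'm not sure whether this read-merge-write approach is the right pattern in Fabric, or whether there is a better built-in way.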
I would greatly appreciate any insights, guidance, or suggestions on how to achieve this.
Thank you in advance for your time and assistance.
Best regards,
Ares