Issue with max block count in Azure Data Lake Gen2
Starting on 03/02/23, I have noticed that my Stream Analytics job creates more files per day than before. Looking through the files' properties, I discovered that each file now maxes out at 10,000 blocks, at which point the job starts a new file; previously, files grew to 50,000 blocks before rolling over.

I could not find any announced change to the maximum block count, and according to these pages the limit is still 50,000 blocks per blob in Azure Blob Storage: https://video2.skills-academy.com/en-us/azure/stream-analytics/blob-storage-azure-data-lake-gen2-output, https://video2.skills-academy.com/en-us/azure/azure-resource-manager/management/azure-subscription-service-limits#storage-limits

I also noticed a brief stop of the SA job that morning, along with a "Health Event Activated" error that contained no further details.
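To show why this matters for the daily file count, here is a minimal sketch of the arithmetic. It assumes, purely for illustration, a fixed per-block size (the actual block size Stream Analytics writes varies with throughput, so the numbers are hypothetical); the point is that cutting the per-file block cap from 50,000 to 10,000 multiplies the number of files per day by about five for the same data volume.

```python
from math import ceil

BLOCK_BYTES = 4 * 2**20     # hypothetical fixed block size (~4 MiB)
OLD_MAX_BLOCKS = 50_000     # documented per-blob committed-block limit
NEW_OBSERVED_MAX = 10_000   # limit observed after 03/02/23

def files_per_day(daily_bytes: int, block_bytes: int, max_blocks: int) -> int:
    """Number of output files one day's data spans when each file is
    capped at max_blocks blocks of block_bytes each."""
    max_file_bytes = block_bytes * max_blocks
    return ceil(daily_bytes / max_file_bytes)

# Example: a day producing 2x the old per-file capacity of data.
daily = 2 * OLD_MAX_BLOCKS * BLOCK_BYTES
print(files_per_day(daily, BLOCK_BYTES, OLD_MAX_BLOCKS))    # 2 files before
print(files_per_day(daily, BLOCK_BYTES, NEW_OBSERVED_MAX))  # 10 files now
```

For anyone wanting to reproduce the observation itself, the committed-block count of a blob can be read with `BlobClient.get_block_list("committed")` from the `azure-storage-blob` SDK, or seen in the blob's properties in the portal, which is how the 10,000-block ceiling was spotted here.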