@Raj Bisht - Thanks for the question and using MS Q&A platform.
Here is the response shared by the internal team:
Yes, bulk inserts are the default and only option when using the SQL Database output. Internally, Azure Stream Analytics uses the ADO.NET SqlBulkCopy API to perform these bulk inserts.
Based on my understanding, the following two steps help achieve minimal logging:
- Table Lock: Use the TABLOCK hint to lock the table during the bulk insert operation. This is necessary for minimal logging.
- Non-Clustered Indexes: If your table has non-clustered indexes, minimal logging might not be fully achievable. Consider dropping the indexes before the bulk insert and recreating them afterward.
Regarding #1, Azure Stream Analytics already specifies TABLOCK unless the destination table is a memory-optimized table.
If #2 is also not an issue for your table, please engage SQL CSS for further recommendations.
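To make the two steps above concrete, here is a minimal T-SQL sketch of dropping a non-clustered index before a bulk load and recreating it afterward. The table, column, and index names are hypothetical, and the TABLOCK hint is shown only to illustrate the minimal-logging requirement (Stream Analytics applies it for you):

```sql
-- Hypothetical names: dbo.Events, dbo.Events_Staging, IX_Events_DeviceId.
-- 1) Drop the non-clustered index before the bulk load.
DROP INDEX IX_Events_DeviceId ON dbo.Events;

-- 2) Bulk load with a table lock; TABLOCK is one of the prerequisites
--    for minimally logged inserts.
INSERT INTO dbo.Events WITH (TABLOCK) (EventId, DeviceId, Payload)
SELECT EventId, DeviceId, Payload
FROM dbo.Events_Staging;

-- 3) Recreate the index once the load completes.
CREATE NONCLUSTERED INDEX IX_Events_DeviceId ON dbo.Events (DeviceId);
```

Note that recovery model and other conditions also affect whether the insert is actually minimally logged; this only sketches the index-handling part.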
Another option for writing to SQL DW is to try the Synapse output, which internally uses the COPY command to write to the DW table. You can validate whether this option works better in your setup.
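For reference, this is the general shape of the COPY statement for loading a dedicated SQL pool table. The exact statement issued by the Synapse output is internal to the service; the table name, storage path, and credential below are hypothetical placeholders:

```sql
-- Hypothetical target table and storage path, for illustration only.
COPY INTO dbo.Events
FROM 'https://mystorageaccount.blob.core.windows.net/mycontainer/events/*.csv'
WITH (
    FILE_TYPE = 'CSV',
    CREDENTIAL = (IDENTITY = 'Managed Identity')
);
```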
Regarding the document you shared, Optimizing transactions in dedicated SQL pool in Azure Synapse Analytics: it applies to Azure Synapse Analytics.
Hope this helps. Do let us know if you have any further queries.
------------
If this answers your query, do click Accept Answer and Yes for "Was this answer helpful". And, if you have any further query, do let us know.