Thanks for asking! For Azure Cognitive Search, there are several approaches to accommodating larger data sets, ranging from how you structure your indexing requests to how you scale the service.
When pushing data into an index using the Add Documents REST API or the IndexDocuments method (.NET), several key factors affect indexing speed, ranging from service capacity to code-level optimizations:
- Capacity of your service
- Index schema complexity
- Batch size
- Number of threads/workers
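As a rough illustration of the last two points, here is a minimal Python sketch that splits documents into fixed-size batches and uploads them from multiple worker threads. The `upload_batch` function is a placeholder for the real call (for example, `SearchClient.upload_documents` in the `azure-search-documents` SDK, or the Add Documents REST API); the function names, batch size, and worker count here are illustrative assumptions, not values from the documentation.

```python
from concurrent.futures import ThreadPoolExecutor

def chunk(documents, batch_size):
    """Split a document list into fixed-size batches."""
    for i in range(0, len(documents), batch_size):
        yield documents[i:i + batch_size]

def upload_batch(batch):
    # Placeholder: a real implementation would call the Add Documents
    # REST API or SearchClient.upload_documents(documents=batch) from
    # the azure-search-documents SDK.
    return len(batch)

def index_in_parallel(documents, batch_size=1000, workers=4):
    """Upload batches concurrently; return the number of documents sent."""
    batches = list(chunk(documents, batch_size))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(upload_batch, batches)
    return sum(results)
```

For example, 2,500 documents at `batch_size=1000` produce three batches (1000, 1000, 500) that the worker threads can push concurrently.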
Indexing documents in batches significantly improves indexing performance, and determining the optimal batch size for your data is a key part of optimizing indexing speed. The two primary factors influencing the optimal batch size are:
- The schema of your index
- The size of your data
Because the optimal batch size depends on your index and your data, the best approach is to test different batch sizes to determine what results in the fastest indexing speeds for your scenario.
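One minimal way to run such a test is to time a fixed set of documents at several candidate batch sizes and compare. This sketch uses a simulated `upload_batch` (a fixed per-request overhead plus a small per-document cost) so it runs without an Azure service; in a real test you would substitute the actual indexing call, and real results are not necessarily monotonic, since very large batches can hit payload limits or throttling.

```python
import time

def upload_batch(batch):
    # Simulated upload: fixed per-request overhead plus per-document cost.
    # Replace with the real Add Documents call when testing against a
    # live service.
    time.sleep(0.005 + 0.00001 * len(batch))

def measure(documents, batch_size):
    """Return seconds taken to push all documents at this batch size."""
    start = time.perf_counter()
    for i in range(0, len(documents), batch_size):
        upload_batch(documents[i:i + batch_size])
    return time.perf_counter() - start

def best_batch_size(documents, candidates=(100, 500, 1000)):
    """Time each candidate batch size and return the fastest one."""
    timings = {size: measure(documents, size) for size in candidates}
    return min(timings, key=timings.get)
```

In this simulation the fixed per-request overhead dominates, so larger batches come out faster; with real documents the crossover point depends on your schema and document size, which is exactly why measuring is worthwhile.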
This official document, which includes a tutorial with sample code for testing batch sizes, may be helpful: https://video2.skills-academy.com/en-us/azure/search/search-howto-large-index
Let us know if the issue remains or if you have any further questions.