Heap memory issue while processing lakhs of records to Excel in Azure App Service

Koteswara Pentakota 71 Reputation points
2022-06-16T12:43:24.447+00:00

Hi Team,

We have a task to process 10 lakh (1,000,000) records from a MySQL DB, decrypt the sensitive data in some fields, and push them to an Excel file. This process causes a heap memory issue on the App Service, and sometimes the app does not respond. We are using the S3 pricing tier with 7 GB RAM.

The memory issue happens once we finish reading the records and start the loop that decrypts some columns of each record and pushes the row to an Excel file in a container.

Any pointers on how to avoid these memory issues? We are not sure whether upgrading the tier for this task alone is the ideal recommendation.

Thanks,
Kotesh


1 answer

  1. AnuragSingh-MSFT 21,236 Reputation points
    2022-06-17T08:14:33.49+00:00

    Hi @Koteswara Pentakota ,

    Thank you for reaching out to Microsoft Q&A for this question.

    I understand that you are trying to read a large amount of data from the MySQL DB --> process it in your application --> store it in Excel. Below are some suggestions for investigating this issue. I am assuming that you are using .NET for this application, so some of the suggestions are specific to it; however, the underlying concepts apply to any other language/framework.

    1. Instead of reading the entire dataset from the MySQL DB into application memory, read and process it in batches. For example, you may use a data reader class, or use the LIMIT clause with the SELECT statement to read rows in batches of 100/1,000/10,000, etc. Refer to this link for details. This avoids the huge memory requirement of holding everything read from the MySQL DB at once; the rows can be processed in smaller batches and written to the Excel sheet (see the first sketch after this list).

    2. Ensure that Dispose() and using{} are used wherever applicable (see the second sketch after this list). This avoids keeping objects rooted after they are no longer needed, so garbage collection can reclaim them. ref: Memory release

    3. If the steps above do not help, use diagnostic tools like PerfView to analyze the memory leak. The following post is a great guide to get started with diagnosing memory leaks in Azure App Service - Diagnosing Memory Leaks in Azure App Services with PerfView

    4. You might also consider running this application on your VM/on-premises machine with a smaller test dataset to observe the behavior closely and diagnose/troubleshoot the memory leak. There are a number of tools that can help with this, such as -
    PerfView
    Visual Studio
    dotnet-dump (dotnet CLI based)
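
    To illustrate point 1, here is a minimal sketch assuming MySQL Connector/NET (MySql.Data) and keyset paging with LIMIT. The table/column names and the decryptField/appendBatchToExcel delegates are hypothetical placeholders for your own decryption and Excel-writing code (e.g. EPPlus, ClosedXML or the Open XML SDK) - adapt as needed:

    ```csharp
    using System;
    using System.Collections.Generic;
    using MySql.Data.MySqlClient;

    static class BatchedExport
    {
        private const int BatchSize = 5000;

        public static void Run(string connectionString,
                               Func<string, string> decryptField,
                               Action<List<string[]>> appendBatchToExcel)
        {
            using var connection = new MySqlConnection(connectionString);
            connection.Open();

            long lastId = 0;  // keyset paging: cheaper than large OFFSET values
            while (true)
            {
                using var command = new MySqlCommand(
                    "SELECT id, name, sensitive_col FROM records " +
                    "WHERE id > @lastId ORDER BY id LIMIT @batch", connection);
                command.Parameters.AddWithValue("@lastId", lastId);
                command.Parameters.AddWithValue("@batch", BatchSize);

                var batch = new List<string[]>(BatchSize);
                using (var reader = command.ExecuteReader())  // streams rows; never
                {                                             // materializes the full result set
                    while (reader.Read())
                    {
                        lastId = reader.GetInt64(0);
                        batch.Add(new[]
                        {
                            lastId.ToString(),
                            reader.GetString(1),
                            decryptField(reader.GetString(2))  // decrypt only this row's field
                        });
                    }
                }

                if (batch.Count == 0)
                    break;                     // no more rows

                appendBatchToExcel(batch);     // flush; only ~BatchSize rows live in memory
            }
        }
    }
    ```

    Because only one batch of rows is ever held in memory, peak usage stays roughly proportional to BatchSize instead of the full 10 lakh rows.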
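    For point 2, a sketch of disposing the per-field crypto objects, assuming AES purely for illustration (substitute whatever scheme and key/IV handling you actually use). The using declarations ensure the transform and streams are not left rooted between loop iterations:

    ```csharp
    using System.IO;
    using System.Security.Cryptography;

    static class FieldDecryptor
    {
        // Decrypts a single field value; every IDisposable is wrapped in a using
        // declaration so its buffers are released as soon as the method returns.
        public static string Decrypt(byte[] cipherText, byte[] key, byte[] iv)
        {
            using var aes = Aes.Create();
            aes.Key = key;
            aes.IV = iv;

            using var decryptor = aes.CreateDecryptor();
            using var input = new MemoryStream(cipherText);
            using var cryptoStream = new CryptoStream(input, decryptor, CryptoStreamMode.Read);
            using var textReader = new StreamReader(cryptoStream);
            return textReader.ReadToEnd();
        }
    }
    ```

    If the key and IV are fixed, you could create the Aes instance once and reuse it across rows; the important part is that nothing disposable created inside the loop is left undisposed.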

    Increasing the tier of the App Service plan could be a quick fix, but you might start facing such issues again in the future when the data volume in the DB grows or after adding additional features. The memory leak should be investigated and the application's memory usage optimized. You may also connect with Microsoft Support for help investigating this issue with the application.

    Please let me know if you have any questions.

    ---
    Please 'Accept as answer' and ‘Upvote’ if it helped so that it can help others in the community looking for help on similar topics.