Commit bulk objects from a microflow

Hi, I have a requirement where I need to retrieve 50-90k objects, change some attributes, and commit them. Initially I changed them inside a loop and committed outside the loop in the microflow, but because I was committing so many records at once, the application auto-restarted when I executed the microflow.

I then tried retrieving in chunks of 1k records at a time and committing them per chunk, according to this doc. It worked, but the app still auto-restarts when the execution finishes, with heap memory and CPU utilization increasing drastically within 3-5 seconds. I came to know that all the commit actions actually take place only when the microflow reaches its end event, which is why all the objects are still committed at once.

I also came to know that the Start Transaction and End Transaction Java actions can help commit each chunk immediately instead of committing everything at the end of the microflow. But is calling them multiple times good for performance?

Could someone please help me with the best approach to this problem with respect to performance? If you have any docs, please share; it would be helpful.

Thanks in advance.
2 answers

Hello Thmanampudi Lokesh Parameswara Reddy,


Start and End Transaction could indeed help you: as the name says, End Transaction closes the current transaction in the system, committing its changes and freeing up some memory before the next chunk starts.
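To make the pattern concrete, here is a minimal plain-Java sketch (not the actual Mendix API) of committing in fixed-size batches so each batch is its own unit of work. The `commits` counter stands in for a `Core.commit` on the batch followed by End Transaction / Start Transaction on the context; all names here are illustrative assumptions, not Mendix code.

```java
import java.util.List;

public class BatchCommitter {
    // Walks the full list in fixed-size batches. After each batch the
    // work so far is "committed" (stand-in for Core.commit plus
    // EndTransaction/StartTransaction), so memory held by one batch
    // can be released before the next batch starts.
    static int processInBatches(List<Integer> items, int batchSize) {
        int commits = 0;
        for (int offset = 0; offset < items.size(); offset += batchSize) {
            List<Integer> batch =
                items.subList(offset, Math.min(offset + batchSize, items.size()));
            // ... change attributes of the objects in `batch` here ...
            commits++; // one commit/transaction boundary per batch
        }
        return commits;
    }
}
```

For 90k objects with a batch size of 1,000 this yields 90 small commits instead of one 90k-object commit at the microflow's end event, which is exactly the memory profile the Start/End Transaction actions are meant to give you.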

Another possibility is to use a task queue with multiple threads to execute your action, or a scheduled event. Both can also process the work in batches if you like.

A more radical option, outside of the model entirely, is to use a SQL tool to bulk-update and commit directly in the database. However, this is extremely risky, so be very sure of what you are doing.


There are probably other approaches you could take as well; this is just a selection.


Hope this helps,


Good luck!



Batch processing does seem to fit your case. Have a look at this Academy course, which includes a ready-to-adapt solution for batch processing using limit and offset.
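The limit/offset idea can be sketched in plain Java as follows. This is an illustrative stand-in, not Mendix code: `retrievePage` plays the role of a retrieve activity configured with a limit (amount) and an offset, and each page would be changed and committed before the next retrieve, keeping memory bounded.

```java
import java.util.ArrayList;
import java.util.List;

public class OffsetPager {
    // Stand-in for a database retrieve with a limit and an offset.
    static List<Integer> retrievePage(List<Integer> source, int limit, int offset) {
        if (offset >= source.size()) {
            return new ArrayList<>(); // past the end: empty page ends the loop
        }
        return new ArrayList<>(
            source.subList(offset, Math.min(offset + limit, source.size())));
    }

    // Pages through the full set. In a microflow this would be a loop:
    // retrieve a page, change its attributes, commit it, advance the offset,
    // and stop when a retrieve returns an empty page.
    static int processAll(List<Integer> source, int limit) {
        int processed = 0;
        int offset = 0;
        while (true) {
            List<Integer> page = retrievePage(source, limit, offset);
            if (page.isEmpty()) {
                break;
            }
            processed += page.size();
            offset += limit;
        }
        return processed;
    }
}
```

One caveat with this pattern: if the change you commit removes objects from the retrieval's filter (e.g. you retrieve on `Processed = false` and then set it to true), a growing offset will skip records; in that case keep the offset at 0 and simply re-retrieve until the result is empty.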