Commit over 700 thousand records

I have an entity with over 700 thousand records (in Acceptance; there could be more in Production), and I need to change an attribute value on all of the objects. I built a microflow that retrieves 500 objects at a time using a custom range, loops over them, commits them, and then jumps back to the start of the microflow, repeating until all objects are changed. The issue is that, because there are so many objects, the client shows a "Contact your system administrator" error after running for some time. I debugged it and the attributes are being changed as intended, but the request times out eventually. I am not sure whether running it once as a scheduled event would help. Is there a way to avoid this timeout?
asked
2 answers

Hi Prashant,

Could you please share the microflow you built, along with the error message in the console/application log?

If there is no error in the console/application log, the cause may be that the microflow is triggered synchronously. In that case the error will only be visible in your browser log. Put simply, the operation takes too long to finish, the connection between client and server is lost, and an error is raised on the client side only.

answered

  • Commit the objects outside of the loop, so you perform one commit per batch instead of one per object.
  • Implement offset and limit logic to process the records in pages. Check the logic here: Limit & Offset implementation
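As a rough illustration of the offset/limit batching the bullets describe, here is a minimal sketch in Python. The names `fetch_batch` and `commit_batch` are hypothetical stand-ins for the retrieve and commit actions in the microflow; they are not Mendix APIs:

```python
BATCH_SIZE = 500  # matches the custom range size from the question

def update_all(fetch_batch, commit_batch):
    """Walk the full record set page by page.

    fetch_batch(offset, limit) -> list of records (hypothetical retrieve)
    commit_batch(records)      -> persists one whole batch (hypothetical commit)
    """
    offset = 0
    while True:
        batch = fetch_batch(offset, BATCH_SIZE)
        if not batch:
            break  # no records left, we are done
        for record in batch:
            record["status"] = "processed"  # the attribute change
        commit_batch(batch)  # one commit per batch, not one per object
        offset += BATCH_SIZE
```

One design caveat: this assumes the retrieve does not filter on the attribute being changed. If the retrieve constraint excludes already-updated records, each committed batch drops out of the result set, so the offset should stay at 0 instead of being incremented.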
answered