Hi there
We are creating many transaction lines inside a loop. We create the transaction lines, add them to a list, and then commit the list once the loop has completed. Inside the loop we also trigger task queue jobs that need the transaction line as an input parameter, but the transaction line needs to be committed for the task queue job to work.
What would be the best solution for the performance of the system?
1) Just commit the transaction line inside the loop so the task queue does not break. The upside is that we don't have to retrieve the transaction line later, since we already have it as an input parameter. The transaction line table is massive.
2) Don't commit the transaction line in the loop and just pass along the transaction line ID. With the ID we can retrieve the transaction line from the database, since the task queue jobs only trigger once the master microflow is complete.
Which would be best? Or is there a better way entirely? The transaction table has 100 million plus rows, so we try to retrieve/commit as little as possible.
Regards,
Patrick
Option 2 is the way to go: commit the list once after the loop and pass only the IDs. This way you have only one commit, and only the retrieves in each task queue job itself.
You can optimize this further by passing a list of IDs to the task queue job instead of a single one. Depending on the size of the object, you can retrieve maybe 500, 1000 or 2000 at once, and thus reduce the number of retrieves.
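The batching idea can be sketched as plain Java; the batch size of 1000 and the use of `Long` IDs are assumptions for illustration. In Mendix you would do the equivalent partitioning in the master microflow before enqueuing each batch, and each task queue job would then retrieve its whole batch in one database call.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchEnqueue {
    // Split a list of transaction line IDs into fixed-size batches,
    // so each task queue job retrieves one batch instead of one object.
    static List<List<Long>> partition(List<Long> ids, int batchSize) {
        List<List<Long>> batches = new ArrayList<>();
        for (int i = 0; i < ids.size(); i += batchSize) {
            batches.add(new ArrayList<>(
                ids.subList(i, Math.min(i + batchSize, ids.size()))));
        }
        return batches;
    }

    public static void main(String[] args) {
        // Hypothetical example: 2500 committed transaction line IDs.
        List<Long> ids = new ArrayList<>();
        for (long i = 1; i <= 2500; i++) ids.add(i);

        // 1000 IDs per job -> 3 jobs (1000, 1000, 500)
        // instead of 2500 single-object jobs.
        List<List<Long>> batches = partition(ids, 1000);
        System.out.println(batches.size());
    }
}
```

The trade-off is between job granularity and retrieve count: larger batches mean fewer database round trips, but a failed job reprocesses more lines on retry.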
Alas, I just now tried this but