Question: How to load SAP Data in batches (Microflow)

0
Hello everyone,

I am loading up to 160k records from SAP HANA. On the server I get the following error:

ConnectionBus: org.postgresql.util.PSQLException: ERROR: RELEASE SAVEPOINT can only be used in transaction blocks

It seems I need to split the microflow into smaller batches. My workflow retrieves the list from SAP, iterates through every record, and adds it to a list. Once all records are in the list, the list gets committed to a local entity. Here's what it roughly looks like:

Now I do not know how to set up the batch. I want the microflow to load only 5,000 records at a time, commit those objects, and then repeat the process until all records are committed.

Can anyone help me out?

Thank you very much!
asked
1 answer
0

Hi Markus,

Batching sounds like a good option for 160K records. Maybe this resource can be of value?

https://medium.com/mendix/performance-modeling-in-mendix-batch-processing-1ed245a1e23b 

I hope this helps.
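In a microflow the pattern is usually a loop around a retrieve with a limit and offset, followed by a commit of just that batch. Here is a minimal sketch of that pattern in plain Python, not Mendix; `fetch_batch` and `commit_batch` below are hypothetical stand-ins for the SAP retrieve and the Mendix commit activity, not real API calls:

```python
# Sketch of offset-based batch processing: fetch a fixed-size slice,
# commit it, advance the offset, and stop when the source is exhausted.
def process_in_batches(fetch_batch, commit_batch, batch_size=5000):
    """Fetch `batch_size` records at a time and commit each batch
    separately until no records remain."""
    offset = 0
    total = 0
    while True:
        batch = fetch_batch(offset, batch_size)
        if not batch:
            break  # source exhausted, all records committed
        commit_batch(batch)   # each batch is committed on its own
        total += len(batch)
        offset += batch_size
    return total

# Demo with an in-memory "source" of 160,000 fake records:
source = list(range(160_000))
committed = []
count = process_in_batches(
    lambda off, size: source[off:off + size],  # stand-in for the SAP retrieve
    committed.extend,                          # stand-in for the commit step
    batch_size=5000,
)
```

The key point is that each 5,000-record batch is committed before the next one is fetched, so no single transaction ever holds all 160k records.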

answered