java.lang.OutOfMemoryError: Java heap space ON CLOUD

I am trying to import data into my application from an Excel list, and this works quite well so far. I read each row and create a dummy entity from it, then iterate over this list to build my data structure. After I have created this list of about 2000 entities, I try to add a checklist of about 140 points to each entity. For this I retrieve the data from the server, iterate over the whole list again, and create the checklist for each entity.

However, I run into problems. Although I collect the changed objects in a list and only commit that list at the end, I always get the same error:

java.lang.OutOfMemoryError: Java heap space

The error only occurs when the application is running in the Mendix Cloud; locally I have no memory problems.
1 answer

The problem is that Mendix needs to be able to revert all your actions in case of an error, so it keeps those objects in memory. There are a couple of things you can do.

The Community Commons module has some actions you can use. You could call StartTransaction and EndTransaction in each iteration of reading data; this way Mendix does not have to keep those objects in memory. Do note that even when there is a problem in your microflow, those objects will still be in your database. Since you already use a dummy entity (which I would always advise with this kind of import), this will not lead to problems in your case.

The next step is processing your data. You could do that with custom retrieves where you process your data in batches (using amount and offset). Depending on how much you do per object, I would iterate with an amount of 100 at a time. Do not forget to commit each batch before retrieving the next set of 100. If you do not encounter any memory errors, you could try raising the amount, but you would probably only see marginally better performance with larger amounts.
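The amount/offset batching described above can be sketched in plain Java. This is only an illustration of the loop shape, not Mendix API code: the `retrieve` helper is a hypothetical stand-in for a Mendix retrieve with custom range settings, and the batch size of 100 and record count of 2000 are taken from the discussion above.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchProcessor {
    // Hypothetical stand-in for a server-side retrieve with amount/offset;
    // in Mendix this would be a retrieve action with custom range settings.
    static List<Integer> retrieve(List<Integer> source, int amount, int offset) {
        int from = Math.min(offset, source.size());
        int to = Math.min(offset + amount, source.size());
        return new ArrayList<>(source.subList(from, to));
    }

    public static void main(String[] args) {
        // 2000 dummy records, mirroring the import described in the question.
        List<Integer> allRecords = new ArrayList<>();
        for (int i = 0; i < 2000; i++) allRecords.add(i);

        int amount = 100; // batch size suggested in the answer
        int offset = 0;
        int processed = 0;

        while (true) {
            List<Integer> batch = retrieve(allRecords, amount, offset);
            if (batch.isEmpty()) break; // nothing left to process
            // ... process and commit this batch here, so only ~100 objects
            // need to stay in memory at any one time ...
            processed += batch.size();
            offset += amount; // advance to the next page
        }
        System.out.println(processed); // prints 2000
    }
}
```

The key point is that each pass retrieves and commits a bounded page, so memory use stays roughly constant regardless of how many records the import contains.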