Is 50,000 entities too much?

I've deployed my app in the Mendix cloud and initially it worked fine. I had some trouble when I wanted to upload all relevant information: a little over 65,000 entities, each storing 12 attributes. I then found that one attribute had not been filled in the previous upload, so I wanted to reconstruct it. When I ran a microflow that was supposed to use those 65,000 entities, the app gave a connection error, and now I can't log in anymore. So my question is: should I expect this to be an error of some other kind, or was the microflow too big to run in the cloud environment? Should I account for some limit on the processing power of the Mendix cloud?
1 answer

Is it 65,000 objects (for instance, 65,000 customers) or 65,000 entities (65,000 different types of objects inside Mendix)?

I am guessing it is 65,000 objects.  If this is the case, and your microflow retrieves all 65,000 objects in one retrieve action and then tries to loop through those to do some processing, you are likely running out of memory.  I would suggest running this in smaller batches.  I have found the Community Commons Java action executeMicroflowInBatches to be useful when processing large numbers of objects.  You can create a microflow that processes a single object, and then have another microflow that uses this Java action to call that microflow.  You can specify how big you want the batches to be (maybe 500 or 1000 objects at a time).
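The idea behind executeMicroflowInBatches can be sketched in plain Java. This is a hypothetical illustration of the batching pattern, not the actual Community Commons or Mendix Core API: rather than loading all 65,000 objects into memory at once, you walk through them in fixed-size chunks, so only one batch is alive at a time.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchProcessor {

    // Hypothetical sketch: process `totalCount` objects in chunks of
    // `batchSize`, so memory use stays proportional to one batch rather
    // than the full data set. Returns the number of objects processed.
    static int processInBatches(int totalCount, int batchSize) {
        int processed = 0;
        for (int offset = 0; offset < totalCount; offset += batchSize) {
            int size = Math.min(batchSize, totalCount - offset);

            // Stand-in for a retrieve with an offset and a limit; in a
            // real app this would fetch one page of objects from storage.
            List<Integer> batch = new ArrayList<>();
            for (int i = 0; i < size; i++) {
                batch.add(offset + i);
            }

            // Stand-in for calling the per-object microflow on each item.
            processed += batch.size();

            // The batch goes out of scope here and can be garbage
            // collected before the next chunk is retrieved.
        }
        return processed;
    }

    public static void main(String[] args) {
        // 65,000 objects in batches of 500 = 130 retrieves instead of one
        // giant retrieve holding everything in memory at once.
        System.out.println(processInBatches(65000, 500)); // prints 65000
    }
}
```

With a batch size of 500 or 1,000, the memory high-water mark is one batch plus bookkeeping, which is why this approach avoids the out-of-memory behavior described above.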

This is just one approach - I am sure the community has other ways to accomplish this type of batch processing.