Microflow Works Locally but Throws Error When Deploying to Private Cloud

Hi all, I am facing an issue in my Mendix application where a microflow executes successfully on my local server, but when I deploy the same application to a private cloud environment, the microflow throws an error during execution.

Scenario: In this microflow, I upload an Excel file and validate its data using an API.

When the Excel file contains 70-80 rows: the microflow executes without any issues, both on my local server and in the cloud.
When the Excel file contains 100 rows or more: the microflow executes fine on the local server, but when deployed to the private cloud, it throws an error.
asked
1 answer

Hi Rushikesh Chitale,

I think the biggest difference between a remote system and your local system is the available resources. I cannot read your microflow, as the image is too small. Are you using the Excel Import module from the Marketplace? And are you importing xlsx or xls? Can you inspect and share the logs for the error? If it's a memory issue: can you share the total RAM allocated to your app and the amount of RAM used "at rest"?
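To confirm or rule out the memory theory, you could log the JVM heap usage right before and after the import step. Below is a minimal sketch of a custom Java action that does this; the action name LogMemoryUsage and the log node "ExcelImport" are illustrative, not part of any existing module:

```java
import com.mendix.core.Core;
import com.mendix.systemwideinterfaces.core.IContext;
import com.mendix.webui.CustomJavaAction;

// Hypothetical Java action: call it before and after the Excel import
// to compare "at rest" memory with memory during the import.
public class LogMemoryUsage extends CustomJavaAction<java.lang.Void> {

    public LogMemoryUsage(IContext context) {
        super(context);
    }

    @Override
    public java.lang.Void executeAction() throws Exception {
        // Snapshot of the JVM heap at this point in the microflow.
        Runtime rt = Runtime.getRuntime();
        long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
        long maxMb = rt.maxMemory() / (1024 * 1024);
        Core.getLogger("ExcelImport").info(
            "JVM heap used: " + usedMb + " MB of " + maxMb + " MB max");
        return null;
    }

    @Override
    public java.lang.String toString() {
        return "LogMemoryUsage";
    }
}
```

If the "used" number jumps close to the maximum while importing the larger file on the cloud environment, that would point strongly at a memory problem.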

With the little information I have, I will do a bit of guessing: you are importing an xls file, which needs to be loaded into memory all at once, and depending on the number of columns this can fill up quite a lot of memory. With a 1 GB memory container running on your Kubernetes cluster, that is quickly filled by the Mendix model (>700 MB). Some list retrieves and changed objects in memory are fine, but loading an xls will blast you out of memory.

If this is the issue, consider importing xlsx and/or increasing the memory of your container.
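If you cannot increase the container memory, another option is to not keep every imported row in memory at once, but to validate and commit the rows in fixed-size batches. A rough sketch, assuming the rows have already been imported as Mendix objects and a helper class is called from a Java action; the class name, BATCH_SIZE, and the validation step are placeholders for your own logic:

```java
import java.util.ArrayList;
import java.util.List;

import com.mendix.core.Core;
import com.mendix.systemwideinterfaces.core.IContext;
import com.mendix.systemwideinterfaces.core.IMendixObject;

// Hypothetical helper: commit imported rows in chunks so the runtime
// does not hold all uncommitted objects in memory at the same time.
public class BatchedRowProcessor {

    private static final int BATCH_SIZE = 50; // tune to your row size

    public static void process(IContext context, List<IMendixObject> rows) throws Exception {
        List<IMendixObject> batch = new ArrayList<>();
        for (IMendixObject row : rows) {
            // ... call your validation API for this row here (placeholder) ...
            batch.add(row);
            if (batch.size() >= BATCH_SIZE) {
                Core.commit(context, batch); // persist and release this chunk
                batch.clear();
            }
        }
        if (!batch.isEmpty()) {
            Core.commit(context, batch); // commit the remainder
        }
    }
}
```

The same idea can also be expressed purely in microflows by looping over the list in chunks and committing per chunk, rather than committing everything at the end.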

In any case, I would recommend splitting up your microflow into smaller pieces (sub-microflows), as no developer wants to maintain a microflow with this many activities.

Good luck!

answered