Error "Maximum run time exceeded, framework is now terminating" when running app

Hello everyone,

I'm trying to create a process that imports data from many huge files. When I run the application on my local PC, the import works correctly, but it takes a long time. However, when I import too many files, the error "Maximum run time exceeded, framework is now terminating" occurs and the application shuts down. Can I configure the "Maximum run time" to get past this issue? I tried setting the values of "SessionTimeout", "SessionKeepAliveUpdatesInterval", and so on, but it did not work.

Does anyone have experience with this issue?

Thank you.

2022/11/01 update: This issue only occurred when running the application locally. I have deployed the app to the cloud and it runs normally, so I am closing this topic here. Anyone who wants to solve the same issue in a local environment, please refer to Marco Spoel's answer below.

Thank you.
asked
2 answers

Hi Pham,

I can’t answer your question, but here are my strategies to avoid it.

In the Marketplace module Community Commons you can find the ORM section: https://docs.mendix.com/appstore/modules/community-commons-function-library/#37-orm

With these Actions:

  • StartTransaction – This starts a transaction. If a transaction is already started for this context, a savepoint will be added.
  • EndTransaction – This commits the transaction, which will end the transaction or remove a savepoint from the queue if the transaction is nested.

I use them to set up a new transaction per iteration over a partial scope. In your case, I would use this to break the list of files down into individual import assignments, as sketched below.
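As a rough illustration, here is a minimal Java sketch of that pattern, assuming it runs inside a Mendix Java action. The microflow name "MyModule.ImportSingleFile" and its "File" parameter are hypothetical, and the startTransaction/endTransaction calls on the context are what the Community Commons StartTransaction and EndTransaction actions delegate to (verify against your Mendix version):

```java
// Minimal sketch (not production code): commit each file's import in its
// own transaction so no single transaction spans the whole batch.
import java.util.List;

import com.mendix.core.Core;
import com.mendix.systemwideinterfaces.core.IContext;
import com.mendix.systemwideinterfaces.core.IMendixObject;

public class BatchImport {

    // Error handling is omitted for brevity.
    public static void importFiles(IContext context, List<IMendixObject> files) throws Exception {
        for (IMendixObject file : files) {
            // Same effect as the Community Commons StartTransaction action:
            // opens a transaction, or adds a savepoint if one is already open.
            context.startTransaction();

            // Hypothetical microflow that imports exactly one file.
            Core.microflowCall("MyModule.ImportSingleFile")
                .withParam("File", file)
                .execute(context);

            // Same effect as the Community Commons EndTransaction action:
            // commits this file's work before moving on to the next file.
            context.endTransaction();
        }
    }
}
```

This keeps every file's work in a short-lived transaction, so a failure affects at most the file currently being processed.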

The second thing I would implement is to assign the import process (per file) to a task queue: https://docs.mendix.com/refguide/task-queue/

Each file will be imported using its own context (like individual background jobs).
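Task queues are normally configured in Studio Pro (a "Call microflow" action can be marked to run as a task in a queue), but as a hypothetical sketch of the same idea from Java, here is the executeInBackground variant of the microflow call builder that came with task queues in Mendix 9 (check the Task Queue documentation for the exact API in your version); the queue name "ImportQueue" and the microflow name are made up:

```java
// Hypothetical sketch: hand each file to a task queue so every import
// runs as a separate background task with its own context.
import java.util.List;

import com.mendix.core.Core;
import com.mendix.systemwideinterfaces.core.IContext;
import com.mendix.systemwideinterfaces.core.IMendixObject;

public class QueueImports {

    public static void enqueueAll(IContext context, List<IMendixObject> files) {
        for (IMendixObject file : files) {
            // Schedules the microflow on the named task queue instead of
            // running it synchronously inside the current request.
            Core.microflowCall("MyModule.ImportSingleFile")
                .withParam("File", file)
                .executeInBackground(context, "ImportQueue");
        }
    }
}
```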

Go Make IT,

(If it worked for you, please accept the answer)

answered

Usually we limit the import by restricting the user to 1,000 records at a time. You may also need to increase the memory of the environment to handle this kind of load. A sketch of that batching idea follows.
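For illustration, here is a minimal, framework-agnostic Java sketch of that batching approach; the batch size and the processBatch method are placeholders:

```java
import java.util.List;

public class ChunkedImport {

    private static final int BATCH_SIZE = 1000; // records per import run

    // Process the records in fixed-size batches so memory use stays
    // bounded no matter how large the full data set is.
    public static <T> void importInBatches(List<T> records) {
        for (int start = 0; start < records.size(); start += BATCH_SIZE) {
            int end = Math.min(start + BATCH_SIZE, records.size());
            List<T> batch = records.subList(start, end);
            processBatch(batch);
        }
    }

    // Placeholder for the actual import logic; in a real app each batch
    // would be committed separately.
    private static <T> void processBatch(List<T> batch) {
    }
}
```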

answered