GC Overflow error

We've suddenly started getting these when trying to process about 60,000 records in batches of different sizes in a microflow. I can't see how the microflow could be causing it, and it has been seen to work, but can anyone explain in what circumstances a Mendix application might throw this, please?

2010-03-23 15:35:51.792 ERROR - EXTERNALINTERFACE: java.lang.OutOfMemoryError: GC overhead limit exceeded
2 answers

The exception thrown here is an OutOfMemoryError, which means your application ran out of memory while performing this action.

For batches of 60,000 records you could either write a Java action that uses our Core API's batch method (Core.createBatch()), or have a look at this solution

You could also simply increase the memory available to the application, but when dealing with large numbers of objects it is advised to use batch processing.
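To illustrate the batching idea, here is a minimal sketch in plain Java. The record retrieval and the "work" are stand-ins (hypothetical names, not the actual Mendix Core API); the point is that only one batch's worth of objects is reachable at a time, so the garbage collector can reclaim each batch before the next one is built:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchDemo {
    // Process totalRecords items in fixed-size batches. The inner list is a
    // stand-in for a batch of retrieved records; in a real Mendix Java action
    // you would retrieve and commit objects here instead.
    static int processInBatches(int totalRecords, int batchSize) {
        int processed = 0;
        for (int offset = 0; offset < totalRecords; offset += batchSize) {
            int end = Math.min(offset + batchSize, totalRecords);
            List<Integer> batch = new ArrayList<>();
            for (int i = offset; i < end; i++) {
                batch.add(i); // stand-in for retrieving one record
            }
            processed += batch.size(); // stand-in for the real work
            // batch goes out of scope here, so the GC can reclaim it
        }
        return processed;
    }

    public static void main(String[] args) {
        System.out.println(processInBatches(60000, 1000));
    }
}
```

The same structure applies whatever the batch size is: memory use is bounded by the batch size rather than by the total number of records.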


This particular exception actually means too much time is spent garbage collecting. The JVM has not run out of memory yet, but is soaking up a lot of CPU without making significant progress. From the Sun documentation: "if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown. This feature is designed to prevent applications from running for an extended period of time while making little or no progress because the heap is too small. If necessary, this feature can be disabled by adding the option -XX:-UseGCOverheadLimit to the command line".
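For reference, both knobs mentioned in these answers are standard HotSpot JVM startup options; how you pass them depends on how the Mendix runtime is launched. The heap size and jar name below are just example values:

```shell
# Raise the maximum heap to 1 GB (example value) and, only if really needed,
# disable the GC overhead limit check quoted above.
java -Xmx1024m -XX:-UseGCOverheadLimit -jar yourapp.jar
```

Note that disabling the check only hides the symptom: the JVM will still spend most of its time collecting garbage if the heap stays too small for the workload.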

A case in which this could occur is when a large number of temporary objects are created in an environment without much memory available. What is causing the error in this case is hard to say without knowing the exact context, but if a lot of objects are retrieved or created while the amount of used memory is already close to its limit, this error could occur.