Asynchronous Object Commits and Mendix Caching Error

My goal is to decrease the run time of my flat file importer app. I am importing anywhere from 10,000 to 50,000 rows of data in a CSV (90 columns wide). I am trying to batch the records and commit them to the database asynchronously, but I am getting a caching error in my logs:

Fetching object from cache for id '[MendixIdentifier:: id=16888498602641785 objectType=FlatFileInterface.ColumnDefinition entityID=60]' in session '6f96cca1-ae15-469e-8528-0cb38a5aece8' timed out.

Here is a snippet of the custom Java action:

```java
// ImportedCols already holds the IMendixObjects I need to commit
// batchColumns is the size of each batch
// futures is a list of Future<List<IMendixObject>> declared earlier
for (int i = 0; ImportedCols.size() - i * batchColumns > 0; i++) {
    // Determine the batch range; toIndex is exclusive, as subList expects
    int fromIndex = (int) (i * batchColumns);
    int toIndex = (int) Math.min(fromIndex + batchColumns, ImportedCols.size());

    // Commit the batch asynchronously
    log.info(String.format("Committing ImportedCols batch %d-%d", fromIndex, toIndex - 1));
    List<IMendixObject> batch = ImportedCols.subList(fromIndex, toIndex);
    futures.addAll(Core.commitAsync(context, batch));
}

// Wait for all asynchronous commits to finish
for (Future<List<IMendixObject>> f : futures) {
    f.get();
}
```
asked
1 answer

I recommend doing this by firing microflows asynchronously (using Core.executeAsync). Take a look at https://world.mendix.com/display/refguide4/Garbage+collection for what might be going wrong: your objects may be getting garbage collected because you are not retaining them, and once the Java action has ended, the only remaining reference to them is in your own asynchronous code.
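For illustration, here is a minimal sketch of that pattern: split the imported objects into batches, hand each batch to a microflow via Core.executeAsync, and wait for the resulting futures before the Java action returns. The microflow name 'FlatFileInterface.CommitBatch' is hypothetical, and the exact Core.executeAsync signature and return type vary between runtime versions, so check the API documentation for the version you are on.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Future;

import com.mendix.core.Core;
import com.mendix.systemwideinterfaces.core.IContext;
import com.mendix.systemwideinterfaces.core.IMendixObject;

public final class BatchCommitViaMicroflows {

    // Sketch only: 'FlatFileInterface.CommitBatch' is a hypothetical microflow
    // that takes a list parameter and commits it; the Core.executeAsync
    // signature assumed here should be verified against your runtime version.
    public static void commitInBatches(IContext context, List<IMendixObject> importedCols, int batchSize)
            throws Exception {
        List<Future<Object>> futures = new ArrayList<>();

        for (int fromIndex = 0; fromIndex < importedCols.size(); fromIndex += batchSize) {
            int toIndex = Math.min(fromIndex + batchSize, importedCols.size());
            List<IMendixObject> batch = importedCols.subList(fromIndex, toIndex);

            // Fire the microflow asynchronously for this batch
            futures.add(Core.executeAsync(context, "FlatFileInterface.CommitBatch", batch));
        }

        // Block until every batch has finished, so the objects stay referenced
        // from this Java action until all commits are done
        for (Future<Object> f : futures) {
            f.get();
        }
    }
}
```

Keeping the f.get() loop inside the Java action also addresses the garbage collection point above: the action does not return until every asynchronous commit has completed.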

Update: this does not seem to be the problem; I don't know what it is, then.

By the way, I'm not sure this will actually result in a speed increase: it is still a single database handling the inserts, so committing these objects asynchronously might even turn out slower because of the extra overhead and locking.
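If the asynchronous route doesn't pay off, a plain synchronous batched commit is a useful baseline to measure against. A minimal sketch, assuming Core.commit accepts a context and a list of objects:

```java
import java.util.List;

import com.mendix.core.Core;
import com.mendix.core.CoreException;
import com.mendix.systemwideinterfaces.core.IContext;
import com.mendix.systemwideinterfaces.core.IMendixObject;

public final class SynchronousBatchCommit {

    // Commits the objects in fixed-size batches on the calling thread,
    // so there is only ever one transaction hitting the database at a time.
    public static void commitInBatches(IContext context, List<IMendixObject> objects, int batchSize)
            throws CoreException {
        for (int fromIndex = 0; fromIndex < objects.size(); fromIndex += batchSize) {
            int toIndex = Math.min(fromIndex + batchSize, objects.size());
            Core.commit(context, objects.subList(fromIndex, toIndex));
        }
    }
}
```

Timing both versions on a representative CSV is the simplest way to see whether the asynchronous commits actually help.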

answered