Holding reference data in memory for Java actions

I've got some Java actions which load a bunch of objects into HashMaps in memory and then look them up from time to time, so that we don't constantly have to query the database to see whether objects exist or to retrieve them. This is done for performance reasons, with code such as this:

    // Get all tasks and put them in a map
    public static HashMap<String, IMendixIdentifier> getAllTasks(IContext c) throws CoreException {
        List<IMendixObject> allTasks = Core.retrieveXPathQuery(c, "//" + Task.getType());
        HashMap<String, IMendixIdentifier> map = new HashMap<String, IMendixIdentifier>();
        for (IMendixObject object : allTasks) {
            if ((String) object.getValue(Task.MemberNames.name.toString()) != null) {
                map.put((String) object.getValue(Task.MemberNames.name.toString()), object.getId());
            }
        }
        return map;
    }
    ...
    HashMap<String, IMendixIdentifier> tasks = getAllTasks(getContext());
    ...
    IMendixIdentifier taskId = tasks.get(someKey);

My problem is that the HashMap is built from scratch every time the Java action runs, and this takes quite a long time. What I'd like to do is build it once when I start the server and hold it in memory so that it can be accessed during all subsequent runs of the action.

The question is this: if I build the HashMap in a separate Java action which runs at server start time (or whenever I like), could another, independent Java action triggered by a microflow then access the HashMap I created in memory earlier, and if so, how? If I could do that, my lookup tables would only need to be constructed once, when I start the server.
2 answers

Although this is not recommended (premature optimization is almost always evil), you could create a static HashMap. The whole Mendix framework runs in one JVM, so if you create a static HashMap it will remain in memory, and every Java action running in that JVM can access it.
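A minimal sketch of what that could look like. Everything here is invented for the example (class name, method names, and using `Long` as a stand-in for `IMendixIdentifier`); it is not Mendix API. A `ConcurrentHashMap` is used instead of a plain `HashMap` to hedge against the synchronization problems mentioned elsewhere in this thread:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch only: all names here are made up for this example.
// A static field lives for the lifetime of the JVM, so any Java action
// running in the same server process can read it after it has been
// populated once (e.g. in an after-startup action).
public final class TaskCache {

    // ConcurrentHashMap rather than HashMap, so that concurrent reads
    // and writes from different user requests are thread-safe.
    private static final Map<String, Long> CACHE = new ConcurrentHashMap<>();

    private TaskCache() {}

    public static void put(String name, Long id) {
        CACHE.put(name, id);
    }

    public static Long get(String name) {
        return CACHE.get(name);
    }

    public static boolean isEmpty() {
        return CACHE.isEmpty();
    }
}
```

One Java action could fill the cache at server start, and any later action in the same JVM would simply call `TaskCache.get(someKey)`.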


If your Java action takes such a long time to build the HashMap, you should be asking whether this approach of retrieving all tasks up front is the right one for your situation.

Especially when you use static HashMaps, that can create a lot of new and complex problems. What happens when a user creates a new task? Its id has to be added to the map as well. And what happens when two users add a new task at the same time? A HashMap isn't synchronized, which means that when two threads modify the map at the same time, some of the changes can be lost.

What I suggest is to either accept that it takes a long time, or make some drastic changes in your Java code: while iterating over your dataset, retrieve the required tasks in batches as you need them, instead of placing them all in memory.
If you choose to change your code, the structure will probably look something like this:

 retrieve dataset

 for each row in dataset
      put row information in cache

      if there are about 100 rows in the cache  // the batch size depends on the complexity of your objects
           retrieve all tasks required for the cached rows in one query
           find or create the objects
           set the references to the tasks
           commit the objects
           clear the cache

      continue with the next row
 close the dataset
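The steps above can be sketched in plain Java. This is only an illustration of the batching pattern: it uses ordinary collections instead of the Mendix API, and every name in it (the class, the `lookupTasks` function, the string "rows") is invented for the example:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

// Illustrative sketch of batched lookups: instead of loading every task into
// memory up front, collect rows into a small cache and do one lookup query
// per batch. All names here are hypothetical.
public class BatchProcessor {

    static final int BATCH_SIZE = 100; // tune to the complexity of your objects

    // lookupTasks stands in for a single database query that fetches all
    // tasks whose keys appear in the given batch of rows.
    public static List<String> process(List<String> rows,
                                       Function<List<String>, Map<String, Long>> lookupTasks) {
        List<String> processed = new ArrayList<>();
        List<String> cache = new ArrayList<>();

        for (String row : rows) {
            cache.add(row);
            if (cache.size() >= BATCH_SIZE) {
                flush(cache, lookupTasks, processed);
            }
        }
        flush(cache, lookupTasks, processed); // handle the final partial batch
        return processed;
    }

    private static void flush(List<String> cache,
                              Function<List<String>, Map<String, Long>> lookupTasks,
                              List<String> processed) {
        if (cache.isEmpty()) return;
        Map<String, Long> tasks = lookupTasks.apply(cache); // one query per batch
        for (String row : cache) {
            Long taskId = tasks.get(row);       // find the matching task
            processed.add(row + ":" + taskId);  // stand-in for "set reference and commit"
        }
        cache.clear(); // start the next batch with an empty cache
    }
}
```

This way memory use is bounded by the batch size, and the number of database queries drops from one per row to one per batch.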