LDAP configuration memory requirements

I am trying to configure an LDAP integration with AD and have now successfully got authentication through Active Directory working, but I am having memory issues while setting up the user mappings. It seems that each time you change the LDAP configuration, the module reads the whole of the LDAP database into memory. In my case this is a large database, which causes Java out-of-memory errors. It is also very slow.

For example, below is the error I get when trying to set the field to use as the login field (sAMAccountName) and the other fields used to import accounts. I haven't told it to perform an import; I'm just trying to set up the user mappings. Saving the mapping seems to read the whole LDAP database and causes the error after a couple of hours. I guess I could keep increasing the size of my JVM heap (currently 1536 MB), but it seems to me that there should be a better way of doing this that does not need the whole of the LDAP data in memory.

2011-02-20 23:36:19.450 WARNING - Jetty: EXCEPTION
2011-02-20 23:36:19.450 WARNING - Jetty: java.lang.OutOfMemoryError: Java heap space
2011-02-20 23:36:19.451 ERROR - Connector: An error has occurred while handling the request.
[User 'dsanders' with roles 'User, Segment01, Requester, Administrator']
2011-02-20 23:36:19.451 ERROR - Connector: com.mendix.core.CoreException: Exception occurred in action 'Microflow [Ldap.UpdateLdapServerLoginField]', all database changes executed by this action were rolled back
    at com.mendix.core.actionmanagement.CoreAction.d(SourceFile:553)
Caused by: java.lang.NullPointerException
    at n.c(SourceFile:198)
    at ls.a(SourceFile:74)
    at jm.a(SourceFile:402)
    at iR.endTransaction(SourceFile:399)
    at eO.executeAction(SourceFile:104)
    at com.mendix.systemwideinterfaces.core.UserAction.execute(SourceFile:49)
    at com.mendix.core.actionmanagement.CoreAction.call(SourceFile:473)
    at it.b(SourceFile:155)
    at com.mendix.core.Core.execute(SourceFile:191)
    at dw.execute(SourceFile:183)
    at ju.a(SourceFile:299)
    at ju.a(SourceFile:230)
    at ju.processRequest(SourceFile:174)
    at fC.a(SourceFile:71)
    at com.mendix.core.MxRuntime.processRequest(SourceFile:916)
    at com.mendix.m2ee.server.handler.RuntimeHandler.handle(RuntimeHandler.java:42)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:113)
    at org.eclipse.jetty.server.Server.handle(Server.java:334)
    at org.eclipse.jetty.server.HttpConnection.handleRequest(HttpConnection.java:559)
    at org.eclipse.jetty.server.HttpConnection$RequestHandler.content(HttpConnection.java:1007)
    at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:747)
    at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:203)
    at org.eclipse.jetty.server.HttpConnection.handle(HttpConnection.java:406)
    at org.eclipse.jetty.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:462)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:436)
    at java.lang.Thread.run(Unknown Source)
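(For anyone hitting the same error: the standard JVM flags for raising the heap limit are shown below. The 2048 MB figure is just an example value, not a recommendation, and a larger heap only delays the problem if the module really does load the entire directory.)

```
# Example only: raise the maximum heap from 1536 MB to 2048 MB
# when launching the runtime's JVM (the exact launch command varies
# per deployment; "..." stands for the rest of that command).
java -Xms512m -Xmx2048m ...
```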
asked
2 answers

Hi David,

I think the creator of this module will have to look into whether performance for large datasets can be improved. Please file a ticket in the Support Portal.

answered

Maybe you could specify a more specific root directory (search base), so that only the relevant subtree is read instead of the whole directory?
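To illustrate the idea in plain JNDI terms: narrowing the search base and restricting which attributes a query returns keeps it from pulling the whole directory into memory. This is only a sketch of the concept, not the module's actual code; the DN, attribute names, and count limit below are placeholder assumptions.

```java
import javax.naming.directory.SearchControls;

public class LdapQuerySketch {

    // Hypothetical search base: with a narrower base DN, the server only
    // returns entries under this subtree instead of the whole directory.
    static final String SEARCH_BASE = "OU=Staff,DC=example,DC=com";

    // Build search controls that request only the attributes the user
    // mapping needs, with a cap on the number of entries per query.
    static SearchControls buildUserMappingControls() {
        SearchControls controls = new SearchControls();
        controls.setSearchScope(SearchControls.SUBTREE_SCOPE);
        // Fetch only the mapped attributes, not every attribute of every entry.
        controls.setReturningAttributes(new String[] { "sAMAccountName", "cn", "mail" });
        // Limit how many entries a single search may return.
        controls.setCountLimit(1000);
        return controls;
    }

    public static void main(String[] args) {
        SearchControls c = buildUserMappingControls();
        System.out.println("count limit: " + c.getCountLimit());
        System.out.println("attributes: " + c.getReturningAttributes().length);
    }
}
```

These controls would be passed to `DirContext.search(SEARCH_BASE, filter, controls)` against a live server; combined with a sensible filter, this avoids ever materializing the full directory in the JVM heap.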

answered