Display a huge amount of data from an external database

I have an external database with millions of records in it. I want to use Mendix to expose the data in this database to an end user. The data is updated frequently by a back-end system. Is it possible to display the data in real time with Mendix? And what would be the best way to approach this? Thanks, Ben
asked
4 answers

AFAIK there is currently no way to do this. You can connect to an external database and read the data (Database Replication module), but there is no way to display that data without first importing it. What would be ideal is something like a non-persistent entity based on the retrieved data, so the data could be shown without importing it. Perhaps you should create a request for enhancement in the support portal?

answered

There is no way to do this natively in the platform. However, you should be able to create a Java action that retrieves the information you need, maps it to non-persistent objects in Mendix, and returns a list of those non-persistent Mendix objects.

So in short, no easy way of doing it.
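In outline, the mapping step of such a Java action could look like the sketch below. The `Invoice` class and its column names are made up for illustration; a real Java action would create the non-persistent objects through the Mendix Core API (e.g. `Core.instantiate`) rather than a plain class, and the rows would come from a JDBC `ResultSet`:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class InvoiceMapper {
    // Stand-in for a non-persistent Mendix entity; a real Java action
    // would call Core.instantiate(context, "Module.Invoice") instead.
    public static class Invoice {
        public final String number;
        public final double amount;
        public Invoice(String number, double amount) {
            this.number = number;
            this.amount = amount;
        }
    }

    // Maps raw result rows (e.g. read from a JDBC ResultSet) to objects
    // that can be returned as a list from the Java action.
    public static List<Invoice> mapRows(List<Map<String, Object>> rows) {
        List<Invoice> result = new ArrayList<>();
        for (Map<String, Object> row : rows) {
            result.add(new Invoice(
                (String) row.get("invoice_number"),
                ((Number) row.get("amount")).doubleValue()));
        }
        return result;
    }
}
```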

answered

As Kilian writes, you can do this with a custom Java action. As input you can give it a SQL statement, a JDBC connection URL, and the expected result entity. With millions of records, it's best to retrieve the smallest set of records you need, i.e., use the SQL statement to aggregate the records.

I've implemented one scenario where the external database contained timeseries events. Using a SQL statement, only the records of the last month were queried and grouped into 15-minute intervals to calculate averages. This results in 31 * 24 * 4 = 2976 entities. If you can live with the data not being real-time, syncing once per hour or per day and storing the result in entities for reuse in Mendix would probably give the best user experience.
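As an illustration of that aggregation, the bucketing arithmetic can be sketched as follows. The embedded query is hypothetical (table and column names are made up, and date functions vary per database vendor); the point is that grouping happens in the external database, not in Mendix:

```java
public class TimeseriesBuckets {
    // Hypothetical aggregation query; table/column names are examples,
    // and the exact date/epoch functions differ per database vendor.
    public static final String QUERY =
        "SELECT FLOOR(epoch_seconds / 900) AS bucket, AVG(value) AS avg_value "
      + "FROM events WHERE event_time >= ? "
      + "GROUP BY FLOOR(epoch_seconds / 900)";

    // Floors an epoch-seconds timestamp to its 15-minute (900 s) bucket.
    public static long bucketOf(long epochSeconds) {
        return epochSeconds / 900;
    }

    // One month of 15-minute buckets: days * 24 hours * 4 buckets/hour.
    public static int bucketsPerMonth(int days) {
        return days * 24 * 4;
    }
}
```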

answered

The Database Replication module does support non-persistent objects, so it is possible to import data into memory first. However, you are talking about huge amounts of data, which doesn't sound like something you want to keep in memory. Make sure to identify as many constraints as possible so you can limit what you have in memory at any point in time.


Using the table mapping functionality you can set up the mapping to your non-persistent object. The next step is to set up an 'ImportCall'. In that import call you can also configure two parameters, which you can use to set a relationship. For example, you can have a persistent entity Customer and a non-persistent entity Invoice, where Invoice has a reference to Customer.
In your table mapping you create the mapping to your external invoice table. Within the import call you configure the parameter to be 'Customer', and the module allows you to select the relationship. You can even specify dynamic constraints using attributes from Customer (this way you can automatically add a WHERE clause searching on the CustomerCode).

The next step is to build a microflow that retrieves the created import call and passes it, together with your Customer, to the Java action 'ImportByImportCall'. The module will create the non-persistent objects and set the relationship.

Once that action has completed you can show all the invoices in a page as you would normally do with a non-persistent entity.


I have used this in the past to query through a customer database. I had a 'Search' entity in which I allowed the user to select different options (like date ranges, status, customer name, etc.); all of its attributes were used in the ImportCall constraints. By using the constraints I could limit the results that came back from the external database. This allowed me to dynamically query through thousands of records, show maybe a hundred of them in the UI (through non-persistent objects), and let the user pick one record, which I would then copy into a persistent entity for further processing.
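The dynamic-constraint idea above can be sketched generically: the non-empty search fields are turned into a parameterized WHERE clause, so user input is bound as parameters (for use with a `PreparedStatement`) rather than concatenated into the SQL. The names below (`buildWhere`, the column names) are illustrative and not part of the Database Replication module:

```java
import java.util.List;
import java.util.Map;

public class SearchQueryBuilder {
    // Builds a parameterized WHERE clause from the non-null search fields,
    // appending the corresponding values to paramsOut in order.
    public static String buildWhere(Map<String, Object> filters,
                                    List<Object> paramsOut) {
        StringBuilder where = new StringBuilder();
        for (Map.Entry<String, Object> f : filters.entrySet()) {
            if (f.getValue() == null) continue;  // skip empty search fields
            where.append(where.length() == 0 ? " WHERE " : " AND ");
            where.append(f.getKey()).append(" = ?");
            paramsOut.add(f.getValue());
        }
        return where.toString();
    }
}
```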

answered