Loading 1,000,000 simple entities on Datagrid

I’m developing on Mendix 9.7. My PC is an i7-9850H, 32 GB RAM, Windows 10 Enterprise. While testing some iteration performance (using the local H2 database), I noticed that after creating over 1,000,000 objects (the entity is quite simple, just 3 attributes with indexing and no complex XPath), loading data in Mendix’s default datagrid (10 records per page, not all at once!) becomes almost unusable. I thought the pagination would solve this issue. It’s a little better with a custom datagrid using server-side paging, but still far too slow. An explanation or any tips on this matter would be appreciated.
2 answers

The local H2 database is not suitable for large datasets. It is a Java process that keeps all records in memory.

If you want to test with 1M records, the best way is to install PostgreSQL on your laptop and configure your app to connect to the local Postgres in the Settings window. You should have no problem running 1M records or more with Postgres. With this amount of data it is also important to have the right indexes on your attributes; you can configure these in the entity dialog.
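The effect of an index on a paged grid is easy to demonstrate outside Mendix. The sketch below is a standalone illustration (using SQLite rather than the H2/Postgres databases discussed, and a hypothetical `item` table): a server-side paged grid issues `ORDER BY … LIMIT … OFFSET …` queries, and an index on the sort column spares the database a full sort on every page fetch.

```python
import sqlite3
import time

# Hypothetical example table, not a Mendix entity: 100k rows is enough
# to make the sort cost visible.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE item (id INTEGER PRIMARY KEY, name TEXT, price REAL)")
cur.executemany(
    "INSERT INTO item (name, price) VALUES (?, ?)",
    ((f"item-{i}", i * 0.01) for i in range(100_000)),
)
conn.commit()

def fetch_page(page, size=10):
    # The kind of query a server-side paged datagrid issues per page.
    return cur.execute(
        "SELECT id, name, price FROM item ORDER BY name LIMIT ? OFFSET ?",
        (size, page * size),
    ).fetchall()

# Without an index on the sort column, every page forces a sort of the table.
t0 = time.perf_counter()
fetch_page(5000)
no_index = time.perf_counter() - t0

# With the index, the database reads the requested slice in sorted order.
cur.execute("CREATE INDEX idx_item_name ON item (name)")
t0 = time.perf_counter()
fetch_page(5000)
with_index = time.perf_counter() - t0

print(f"page fetch without index: {no_index:.4f}s, with index: {with_index:.4f}s")
```

This is also why deep pages stay slow even with an index: `OFFSET` still has to skip all preceding rows, which is another reason to push users toward search instead of scrolling through a million records.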

I find the easiest way to run Postgres is with Docker Compose. You can use the following docker-compose.yml configuration file to start it:

version: '3.2'
services:
  db:
    image: postgres
    ports:
      - "5432:5432"
    environment:
      POSTGRES_USER: mendix
      POSTGRES_PASSWORD: mendix   # pick your own; the postgres image requires a password
      POSTGRES_DB: pgdev
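With that file saved as docker-compose.yml, starting and checking the database looks roughly like this (the service name `db` and the `mendix`/`pgdev` credentials match the file above):

```shell
# Start Postgres in the background
docker compose up -d

# Verify it accepts connections, using psql inside the container
docker compose exec db psql -U mendix -d pgdev -c "SELECT version();"

# Stop the container when done (add -v to also drop the data volume)
docker compose down
```

Point your Mendix app at localhost:5432 with these credentials and the data will survive app restarts, unlike the in-memory H2 database.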

Is this something you will need for user requirements, or just testing you are doing? If your users are requesting on-page access to 1,000,000 records, I think you may need to better understand what they are trying to accomplish.

You can provide search capabilities with the datagrid and set the datagrid to wait for search, which will give your users access to all database records via search criteria. Maybe that meets their requirements?