How to import and export a large flat file

I want to import and export about 70,000 lines of data in my application, but the following errors occur in the processing I created.

Import: I import a CSV file into the application using the Flat & Delimited File Import module from the Mendix App Store. I commit after every fixed number of rows, but once the row count passes a certain threshold the response is abruptly delayed and processing never finishes, or Critical-level logs appear frequently on the Mendix console before it ends.

Export: I export to Excel using the XLSReport module. After creating the data to export in a persistent entity, I set an input object that passes the data's association to the XLS report. At this point a "GC overhead limit exceeded" error occurs.

I think both symptoms mean there is not enough memory for the processing. When importing or exporting a large file that would otherwise run out of memory, how do you design the process?
asked
2 answers

Hi Chiaki,

I had similar issues with importing using the flat file importer, and what I did was use the Community Commons module from the App Store.

There are a few Java actions that can help you run the import or export in the background, for example:

  • executeMicroflowInBackground - Similar to RunMicroflowAsyncInQueue, but accepts one argument as a parameter. (new in 2.2)
  • executeMicroflowInBatches (recommended!) - Performs a batch operation on a large dataset by invoking the microflow on small subsets of the data, each with its own database transaction. (new in 2.2)
  • recommitInBatches - Recommits (with events) all items returned by the XPath query. Useful in migration scenarios. (new in 2.4)
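For intuition, the batching pattern behind executeMicroflowInBatches can be sketched in plain Java. This is an illustrative simplification, not the actual Community Commons implementation: split the full set into fixed-size chunks and process each chunk in its own unit of work, so memory use stays bounded regardless of the total row count.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class BatchRunner {

    // Process 'items' in chunks of 'batchSize'; in Mendix each chunk would
    // map to one microflow call with its own database transaction (here
    // simulated by one callback invocation). Returns the number of batches.
    public static <T> int processInBatches(List<T> items, int batchSize,
                                           Consumer<List<T>> batchAction) {
        int batches = 0;
        for (int start = 0; start < items.size(); start += batchSize) {
            int end = Math.min(start + batchSize, items.size());
            // The sublist view keeps only one chunk "active" at a time.
            batchAction.accept(items.subList(start, end));
            batches++;
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 70_000; i++) rows.add(i);

        int batches = processInBatches(rows, 1_000,
                batch -> { /* commit this chunk, then let it be garbage collected */ });
        System.out.println(batches); // 70 batches of 1,000 rows each
    }
}
```

The key design point is that nothing outside the current chunk needs to stay reachable, so the garbage collector can reclaim each processed batch before the next one starts.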

 

I also processed the imported data in batches. You can learn how to deal with batches in the advanced online lecture Win at Working with Data: https://gettingstarted.mendixcloud.com/link/path/3 and its module Data in Microflows: https://gettingstarted.mendixcloud.com/link/module/21/lecture/204.

I would use batches and background processing for the import, and also generate the export file in the background, giving the user a download link once the export document is ready.
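The same idea helps on the export side: instead of building the entire report in memory (which is what triggers the "GC overhead limit exceeded" error), stream rows to the file a chunk at a time. A minimal sketch writing CSV with a buffered writer follows; this is an assumption-laden simplification, since XLSReport produces Excel files, and it only illustrates the memory pattern.

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class StreamingExport {

    // Write rows to 'target' one at a time so only a small write buffer,
    // not the whole 70,000-row dataset, is held in memory at once.
    public static long export(Path target, long rowCount) throws IOException {
        long written = 0;
        try (BufferedWriter out = Files.newBufferedWriter(target)) {
            out.write("id,value");
            out.newLine();
            for (long i = 0; i < rowCount; i++) {
                // In Mendix each row would come from a paged XPath retrieve
                // rather than this synthetic loop.
                out.write(i + ",value-" + i);
                out.newLine();
                written++;
            }
        }
        return written;
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("export", ".csv");
        System.out.println(export(tmp, 70_000)); // 70000
        Files.deleteIfExists(tmp);
    }
}
```

Because the writer flushes to disk as it goes, peak memory depends on the buffer size, not on the number of exported records.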

 

I hope this helps you achieve your goal of importing the 70,000 records and exporting them as well.

Kind regards

answered

You would be re-inventing a lot of functionality by building this yourself via the Community Commons module. I would advise forking the 'Flat & delimited file import' app on GitHub, solving the problem there, and creating a pull request for it. That way, we all benefit.

How to solve the problem? Have a look at Jeffrey's tips.

answered