Hi, we have a situation where around 100k objects are created/synced every week via the Excel Importer from an Excel file. We decided to import this into another entity and use task queues to process it in batches. But we still doubt whether creating 100k records (even without any keys) will consume all memory and restart the application. I went through this post https://docs.mendix.com/refguide/import-a-large-excel-file/#using-data-importer-extension-to-create-entity-using-a-large-excel, but I get lost at the last point: after creating the XML schema and mapping, what is the next step? Can we take the Excel file document and use 'Import from file'? Will that handle the batching, or is there something else that makes sure the app won't face any memory issues? Since I am on Mendix 9, I couldn't use the Data Importer. Could someone from the community help me with what I am missing? Thanks in advance.
asked
Thmanampudi Lokesh Parameswara Reddy
2 answers
The Excel Importer should handle batching of the import data (as long as you're not on 10.2.x to 10.6.x, where it was broken).
The linked article is rather focused on setting up templates and the data model for sheets with many columns.
Try storing unsalted CRCs in combination with a unique key to detect changes. Feed the CRC function with a concatenation of all attribute values. On comparison, compute the CRC and compare it with the stored one. Only process lines that have changed.
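In a Mendix Java action, that CRC-per-row idea could look roughly like the sketch below. The class and method names, and the use of a separator byte between attributes, are my own choices for illustration, not part of any Mendix API:

```java
import java.nio.charset.StandardCharsets;
import java.util.zip.CRC32;

public class ChangeDetector {

    // Compute a CRC32 checksum over the concatenated attribute values of one row.
    // A separator byte between values avoids collisions like ("ab","c") vs ("a","bc").
    public static long rowChecksum(String... attributeValues) {
        CRC32 crc = new CRC32();
        for (String value : attributeValues) {
            crc.update((value == null ? "" : value).getBytes(StandardCharsets.UTF_8));
            crc.update(0x1F); // unit separator between attributes
        }
        return crc.getValue();
    }

    // Compare a freshly computed checksum with the one stored on the entity;
    // only rows where this returns true need to be processed again.
    public static boolean hasChanged(long storedCrc, String... attributeValues) {
        return rowChecksum(attributeValues) != storedCrc;
    }
}
```

Store the checksum as a Long attribute next to the unique key; on the next weekly import, look the row up by key, call `hasChanged`, and skip unchanged rows entirely, which cuts the number of commits dramatically.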