Import around 100k records using CSV connector marketplace module

0
Hi, I have a requirement to import around 100k records on a weekly basis. Some of the data needs to be synced with existing objects and some needs to be created as new objects. Previously we used the Excel Importer module with the "sync objects" option, and it worked fine. But as the amount of data to import grew, the import started taking 2-3 hours, and sometimes the application restarts without importing a single record.

I came across the suggestion that the CSV module can be used to import millions of records with less memory utilization. I went through this post https://medium.com/mendix/import-export-data-in-mendix-using-csv-module-d3eba261ae49, where they import the data into a staging entity and then copy it to the main entity. However, they still commit all the created objects at the end of the microflow. Since a microflow only commits the data when it reaches its end point, committing that many records at once will still result in memory issues, right?

Could someone please help me with how to import millions of records without bringing the application down?

Thanks in advance.
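One common way around this is not to commit everything at the end, but to process the file in fixed-size batches and commit (and clear) each batch as you go, either in a microflow loop or in a Java action. The sketch below shows that batching pattern as a Java action; the entity name MyFirstModule.Record, its attributes, the InputStream parameter, and the batch size of 1000 are illustrative assumptions, not part of the CSV module itself.

```java
// Minimal sketch of a batched CSV import as a Mendix Java action.
// Assumptions (not from the original post): a persistable entity
// "MyFirstModule.Record" with string attributes "Name" and "Email",
// the CSV file passed in as an InputStream, and a batch size of 1000.
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

import com.mendix.core.Core;
import com.mendix.systemwideinterfaces.core.IContext;
import com.mendix.systemwideinterfaces.core.IMendixObject;

public class BatchedCsvImport {

    private static final int BATCH_SIZE = 1000; // tune to your memory budget

    public static void importCsv(IContext context, InputStream csvStream) throws Exception {
        List<IMendixObject> batch = new ArrayList<>(BATCH_SIZE);

        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(csvStream, StandardCharsets.UTF_8))) {

            reader.readLine(); // skip the header row

            String line;
            while ((line = reader.readLine()) != null) {
                // Naive split for illustration only; use a proper CSV parser
                // (e.g. the CSV module itself) when fields can contain quotes or commas.
                String[] cols = line.split(",", -1);

                IMendixObject record = Core.instantiate(context, "MyFirstModule.Record");
                record.setValue(context, "Name", cols[0]);
                record.setValue(context, "Email", cols[1]);
                batch.add(record);

                // Commit each chunk and drop the references so the JVM can
                // reclaim memory instead of holding 100k+ objects at once.
                if (batch.size() >= BATCH_SIZE) {
                    Core.commit(context, batch);
                    batch.clear();
                }
            }
        }

        if (!batch.isEmpty()) {
            Core.commit(context, batch); // commit the final partial batch
        }
    }
}
```

Keep in mind that a single microflow call still runs in one database transaction, so for very large volumes it is common to also split the work over multiple calls (for example per file chunk, or via a scheduled or queued batch job), so that each batch gets its own transaction.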
asked
2 answers
0

Hello Thmanampudi Lokesh Parameswara Reddy,

As far as I know, this comes down to the CSV import action, combined with OQL, being far more optimized than the Excel import action.

For a more technical view of the differences, you would need to debug both actions.

Hope this helps,

Good luck!

answered
0

I'm looking for something similar. For importing, have you tried the 'create object for row' option in the template? It takes less time to import.

answered