Best practice of large data migration

There is a very large amount of data in an old system (I don't know which kind of DB yet). I have to migrate it to Mendix. I checked that a modeler created an index and a unique key on the DB table. I think the general method is:

1. Remove the index
2. Migrate the data
3. Re-attach the index

But in Mendix Cloud I cannot operate on the database directly. I am worried that the migration will take an extremely long time. I would like to know the best practice for large data migration. Please share your experience.
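For reference, the general drop-index/load/re-index method can be sketched like this. This is a minimal illustration using Python's built-in sqlite3; the table and index names are made up, and it only applies to a database you can access directly, which (as noted below) is not the case in Mendix Cloud:

```python
import sqlite3

def bulk_load(conn, rows):
    """Load rows with the index dropped, then rebuild it afterwards."""
    cur = conn.cursor()
    # 1. Remove the index so inserts don't pay the index-maintenance cost.
    cur.execute("DROP INDEX IF EXISTS idx_customer_number")
    # 2. Migrate the data in one bulk operation.
    cur.executemany("INSERT INTO customer (number, name) VALUES (?, ?)", rows)
    # 3. Re-attach the (unique) index once all rows are in place.
    cur.execute("CREATE UNIQUE INDEX idx_customer_number ON customer (number)")
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (number INTEGER, name TEXT)")
conn.execute("CREATE UNIQUE INDEX idx_customer_number ON customer (number)")
bulk_load(conn, [(i, f"name-{i}") for i in range(10_000)])
print(conn.execute("SELECT COUNT(*) FROM customer").fetchone()[0])  # 10000
```

Dropping the index first avoids maintaining it on every insert; recreating it at the end builds it in one pass over the loaded data.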
2 answers

If you want to import data into the Mendix database, you always need to go through the Mendix Runtime. Indeed, as you noticed, you don't have direct access to the database in the Mendix Cloud. Mendix chose this construction for a couple of reasons. The main one is security, but it also keeps the database structure in sync with what the Mendix Runtime expects based on your domain model.

The best practice I use most of the time, depending on how large the data is, is to create a branch line in which I build the migration logic and remove the indexes where possible. First you deploy the branch line and execute the migration, and after that you deploy the original app from the main line again.
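Because everything goes through the runtime, the migration logic in the branch line typically processes the source data in batches rather than in one giant transaction, to keep memory and transaction size bounded. A rough, language-agnostic sketch in Python (the `write_batch` callable and the batch size are placeholders for your own persistence step, e.g. a microflow or Java action committing a list of objects; they are not Mendix APIs):

```python
def migrate_in_batches(source_rows, write_batch, batch_size=1000):
    """Generic batched migration: collect rows and flush once per batch.

    source_rows: any iterable of records from the old system.
    write_batch: a callable that persists one batch of records.
    Returns the total number of migrated records.
    """
    batch = []
    migrated = 0
    for row in source_rows:
        batch.append(row)
        if len(batch) >= batch_size:
            write_batch(batch)
            migrated += len(batch)
            batch = []
    if batch:  # flush the final, possibly partial, batch
        write_batch(batch)
        migrated += len(batch)
    return migrated

# Usage: a fake writer that just records each batch it receives.
batches = []
total = migrate_in_batches(range(2500), batches.append, batch_size=1000)
print(total, len(batches))  # 2500 3
```

Tuning the batch size trades commit overhead against memory use; the right value depends on how wide your records are.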

answered

And it is always good to do a local migration first. This way it is easier to see which commands are issued against the database during the migration. Analyzing these can help in creating and adapting the migration logic.
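The idea of watching which statements a local run issues can be illustrated with sqlite3's trace callback, which records every SQL statement as it is executed. This is an analogy for inspecting database logs locally, not the Mendix mechanism itself:

```python
import sqlite3

statements = []
conn = sqlite3.connect(":memory:")
# Record every SQL statement the connection issues.
conn.set_trace_callback(statements.append)

conn.execute("CREATE TABLE customer (number INTEGER)")
conn.execute("INSERT INTO customer VALUES (1)")

# Inspect what was actually sent to the database.
for sql in statements:
    print(sql)
```

Reviewing the captured statements shows, for example, whether the migration performs row-by-row inserts that could be batched.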

Ronald

[EDIT] I think that this part of the documentation answers your question: https://docs.mendix.com/tips/migrating-your-mendix-database

answered