Hi Sander!
Check out these blog posts about handling large database tables
https://stephanbruijnis.github.io/a-Mendix-blog/content/post/fast-manipulation-of-large-tables.html
https://medium.com/mendix/five-tips-for-working-with-large-database-tables-in-mendix-170210f6f6fd
I recommend doing the batch updates :)
Just to complete this thread :-)
You can also build some recursive logic using this module.
There is a demo app with an example that deletes records; updating them works the same way. The benefit is that each batch runs in its own DB transaction (just as with the process queue logic), which makes it independent of the total number of objects you need to update.
I concur with Lukas' recommendations. Another, perhaps lesser-known, option is the Process Queue module found here. It allows you to queue any functionality and run it in parallel. Just beware of the caveat about using it on horizontally scaled environments.
Sander, if it only has to run once, I would use a microflow and change 500 records at a time: commit those 500, do an end-transaction and a start-transaction, and process the next 500. That will do for a quick change. If it will be used more than once, I would use the queue module.
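Microflows aren't plain code, but the batch-per-transaction pattern described above (update 500, commit, end the transaction, start a fresh one, repeat) can be sketched generically. Here is a minimal Python/sqlite3 illustration; the `record` table, `status` column, and batch size are made-up stand-ins, not anything from the Mendix modules linked above:

```python
import sqlite3

# In-memory demo DB with a hypothetical "record" table (names are made up).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE record (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO record (status) VALUES (?)", [("old",)] * 1234)
conn.commit()

BATCH_SIZE = 500

def update_in_batches(conn):
    """Update records 500 at a time, committing each batch in its own
    transaction so one huge transaction never builds up."""
    total = 0
    while True:
        # Pick the next batch of ids that still need the change.
        ids = [row[0] for row in conn.execute(
            "SELECT id FROM record WHERE status = 'old' LIMIT ?",
            (BATCH_SIZE,))]
        if not ids:
            break
        conn.executemany("UPDATE record SET status = 'new' WHERE id = ?",
                         [(i,) for i in ids])
        conn.commit()  # end this transaction; the next loop starts a fresh one
        total += len(ids)
    return total

print(update_in_batches(conn))  # 1234 (batches of 500, 500, 234)
```

Because each batch is committed separately, memory and lock time stay bounded no matter how many records there are, which is exactly why this approach scales where a single big commit would not.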