Processing large data set with multiple retrievals from DB

Hi,

I am faced with a situation where I have to collect a lot of data from the app's DB and organize it into a single table so I can then do a custom Excel export. There are quite a lot of columns and a lot of data retrievals being done. I am using batch processing and have applied best practices to improve the speed of this export. However, as the data size increases, I have started facing longer export times, or in some cases even strange timeout behavior which results in the flow finishing but the file not being downloaded.

I am asking you for every possible optimization idea you have, including UI/UX changes such as letting the export run in the background while the user does something else. The end goal is to have the export run FASTER and RELIABLY, without any potential issues.

The users do not want fewer columns and do not want multiple files; they even want to add another 10+ columns to this existing export, so please do not consider reducing the output as part of the optimization ideas.

Awaiting and appreciating every idea and response.

Thanks!
1 answer

Hi there,


You can look into OQL. Instead of performing many separate retrieve actions and assembling the rows in the microflow, a single OQL query can join the associated entities and return the flattened result in one database round trip, which might speed up your process (see OQL | Mendix Documentation).
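As a rough illustration of the idea (the module, entity, and association names below are hypothetical, not from your app), one OQL query can replace a chain of per-row retrieves over an association:

```sql
-- Hypothetical domain model: Shop.SalesOrder associated to Shop.Customer
-- via the association Shop.SalesOrder_Customer.
-- One query returns the already-joined, export-ready rows, instead of
-- retrieving customers one by one inside a loop in the microflow.
SELECT
    o.OrderNumber AS OrderNumber,
    o.TotalPrice  AS TotalPrice,
    c.Name        AS CustomerName
FROM Shop.SalesOrder AS o
JOIN o/Shop.SalesOrder_Customer/Shop.Customer AS c
```

You would typically map the result onto a non-persistable entity and feed that into the Excel export, so the database does the joining work rather than the runtime.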