Deleting a large amount of data

Hi all, I want to delete around 1,400,000 objects from an entity that holds around 28,000,000. I tried using the task queue for the deletion, but it was an unsuccessful attempt. If anyone knows a better approach, please help. Thanks.
3 answers

Did you do it in batches? Using the task queue is a good idea, but you still can't retrieve and delete that amount at once.

Retrieve and delete ~5,000 objects; if there was something to delete, call the same microflow in the task queue again, otherwise stop.

Also check whether delete events or delete behavior are slowing things down. If those can be omitted, you could pass the list to a Java action that calls deleteWithoutEvents. If not, decrease the number of objects you retrieve and delete in one batch. If there is delete behavior, it is a lot faster to also retrieve and delete the associated objects yourself instead of relying on the delete behavior.
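
Put together, one run of such a batch microflow looks like the sketch below. This is plain Java used as a stand-in: the `store` list and `deleteBatch` are hypothetical placeholders for the Mendix XPath retrieve and Delete activities, and the `while` loop stands in for the task queue picking up the re-enqueued task.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchDeleteSketch {
    static final int BATCH_SIZE = 5000;

    // Stand-in for the entity's table; in Mendix this would be an XPath
    // retrieve with the limit set to BATCH_SIZE.
    static List<Integer> store = new ArrayList<>();

    // One "microflow run": retrieve up to BATCH_SIZE objects and delete them.
    // Returns true when the caller should enqueue the same microflow on the
    // task queue again, false when there was nothing left to delete.
    static boolean deleteBatch() {
        int n = Math.min(BATCH_SIZE, store.size()); // retrieve (limited)
        if (n == 0) return false;                   // nothing to delete: stop
        store.subList(0, n).clear();                // delete the batch
        return true;                                // something was deleted: requeue
    }

    // Each loop iteration stands in for one task picked up from the queue.
    // Returns the total number of task runs, including the final empty one.
    static int runsUntilDone() {
        int runs = 0;
        while (deleteBatch()) runs++;
        return runs + 1;
    }
}
```

In a real app each requeue is a separate task, so no state (and no retrieved objects) accumulates across runs.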


Hello Omkar,


Did you try using a limit (amount) per retrieve?


Try using 500 as the amount and see how it performs.


For deletion you won't need an offset.
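
To see why no offset is needed (and why one would be harmful): each delete removes the batch from the result set, so a fresh retrieve at offset 0 already returns the next batch, while an advancing offset skips over surviving objects. A small illustration in plain Java, with a list as a stand-in for the entity:

```java
import java.util.List;

public class OffsetPitfall {
    // Stand-in for "retrieve up to `amount` items starting at `offset`,
    // then delete them". Returns how many were deleted.
    static int retrieveAndDelete(List<Integer> data, int amount, int offset) {
        int from = Math.min(offset, data.size());
        int to = Math.min(from + amount, data.size());
        List<Integer> batch = data.subList(from, to);
        int n = batch.size();
        batch.clear();
        return n;
    }

    // Correct pattern: keep the offset at 0. Deleted rows vanish from the
    // result set, so the next batch is always at the front.
    static int drainWithoutOffset(List<Integer> data, int amount) {
        int total = 0, n;
        while ((n = retrieveAndDelete(data, amount, 0)) > 0) total += n;
        return total;
    }
}
```

Advancing the offset between batches, by contrast, leaves part of the data behind, because each delete shifts the remaining rows toward position 0.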


What could influence your performance is delete behaviour; depending on your domain model, it can be wise to delete the underlying objects first, before the main objects.


Kind regards,




Hi Omkar,

A couple of tips when handling large datasets like in your case:

- Retrieve and delete in batches, for instance 5,000 objects, by setting a limit on your XPath retrieve (no offset!)

- Count your retrieved list; if the count equals your limit, run the delete cycle again

- Place an EndTransaction and StartTransaction from CommunityCommons after each delete cycle to free up memory while the action is running

- If you have any delete behavior configured in your domain model, don't rely on it when handling large volumes, because it can create memory problems. Instead, delete your associated objects, also in batches, from the bottom up
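
The cycle in the bullets above can be sketched as follows, again in plain Java as a stand-in for the microflow. The `endTransaction`/`startTransaction` methods are empty placeholders for the CommunityCommons actions, which in a real app commit and reopen the database transaction so the memory held by already-deleted objects is released.

```java
import java.util.ArrayList;
import java.util.List;

public class DeleteCycle {
    static final int LIMIT = 5000;

    // Stand-in for the entity; in Mendix this is an XPath retrieve
    // with the limit set to LIMIT and no offset.
    static List<Integer> store = new ArrayList<>();

    // Placeholders for the CommunityCommons EndTransaction/StartTransaction
    // actions: commit the work so far, then reopen a fresh transaction.
    static void endTransaction() { /* commit */ }
    static void startTransaction() { /* reopen */ }

    // Returns the number of delete cycles performed.
    static int run() {
        int cycles = 0;
        int retrieved;
        do {
            retrieved = Math.min(LIMIT, store.size()); // retrieve with limit
            store.subList(0, retrieved).clear();       // delete the batch
            endTransaction();                          // flush after each cycle
            startTransaction();
            cycles++;
        } while (retrieved == LIMIT);                  // full batch: go again
        return cycles;
    }
}
```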


Good luck!