Hello guys,
I have 400 items that I process in batches. A first workflow schedules 50 items at a time, so it runs 8 times in total. The scheduled items pass through several other workflows, and once the last item has reached the last workflow, the first workflow is triggered again and picks up the next 50 items.
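To make the flow concrete, here is a minimal self-contained Python sketch of how it currently works. The list `db` and the function names (`fetch_batch`, `process`, `delete_items`) are just placeholders for my real database and workflow calls, not actual APIs:

```python
# Minimal stand-in for my setup: `db` plays the role of the real database,
# and the functions are placeholders, not real APIs.

BATCH_SIZE = 50
db = [f"item-{i}" for i in range(400)]  # the 400 items

def fetch_batch(limit):
    # Read (but do not remove) the next `limit` items, like the scheduler does.
    return db[:limit]

def process(batch):
    # Stands in for the chain of downstream workflows the items run through.
    for item in batch:
        pass  # ... per-item work happens here ...

def delete_items(batch):
    # Current behavior: delete AFTER the whole batch has been processed,
    # right BEFORE the next iteration is triggered.
    for item in batch:
        db.remove(item)

iterations = 0
while batch := fetch_batch(BATCH_SIZE):
    process(batch)
    delete_items(batch)  # my worry: if this lags behind the next trigger,
    iterations += 1      # the next fetch_batch() could see undeleted items

print(iterations)  # 8 iterations for 400 items in batches of 50
```

In the sketch the delete always finishes before the next fetch, but in the real system those two steps run in separate workflow executions, which is exactly the gap I don't trust.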
Currently I am deleting the 50 processed items AFTER the last item has been processed and BEFORE the next iteration is triggered (which consumes the next batch of 50 items). But I don't fully trust this: the deletion happens right before the next scheduling, and I worry it might sometimes not complete fast enough, so the next schedule could pick up items that simply haven't been deleted yet.
Therefore I thought about deleting the 50 scheduled items right after they have been scheduled, in the very first workflow. Is that a problem? Or has the scheduling system already "consumed" them at that point, with all the necessary data attached, so it is safe to delete them right away? My assumption is that the database read has already happened and, since all items are scheduled in parallel, they probably don't read from the DB again, but I'm not 100% certain.
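In code terms, the change I have in mind looks like this (reusing the stubs from the sketch above; the key assumption, which is exactly what I'm unsure about, is that scheduling hands each downstream run a full snapshot of the item's data, so the DB row is no longer needed once the batch is scheduled):

```python
# Proposed variant, reusing `db`, `fetch_batch`, `process`, and
# `delete_items` from the sketch above. Assumption to verify: each
# scheduled run carries a complete snapshot of its item, so the DB row
# can be deleted immediately.

db = [f"item-{i}" for i in range(400)]  # refill the stub DB for this variant

def schedule(batch):
    # Snapshot everything the downstream workflows need at schedule time.
    return [{"id": item, "payload": item} for item in batch]

while batch := fetch_batch(BATCH_SIZE):
    scheduled = schedule(batch)
    delete_items(batch)            # proposed: delete right after scheduling,
    for run in scheduled:          # before any downstream processing starts
        process([run["payload"]])  # downstream work uses the snapshot only
```

If the scheduling system does not keep such a snapshot and the downstream workflows read from the DB again, deleting early would obviously break things, and that uncertainty is the core of my question.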
Does that make sense? I might add a drawing to better describe what I mean.
Thanks for any help or advice!