Avoiding Race Conditions When Running a Workflow on a List?

Hey there,

I have a search workflow that

  1. Creates a “Search” object.
  2. Searches via an API call.
  3. Returns the results.
  4. Runs a workflow on the list of results.
  5. That workflow then creates a “Search Result” item and adds it to a list in the “Search” object.

The problem is that this sometimes creates a race condition: two of the result objects get added to the search object at the same time, and one of them is therefore overwritten and doesn’t get added.

The workaround has been to add a sufficient delay between each item in the list, but this is obviously suboptimal because it slows the search down a lot. Is there a better way to handle this?

I’ve built queues to handle race conditions and it’s a pain, but it doesn’t seem like you need that here. Referencing earlier steps in the API workflow should help. For better suggestions, please share the steps of the workflow and the different moving parts.

This doesn’t work because I’m running a workflow on a list; each run doesn’t know about the other instances of the workflow that are running (unless I’m missing something?)

I came across race conditions as a chronic issue with database triggers. So I used an old school logic method that seems to have worked:

  1. Add a field to your Thing called “IsProcessing” of type number
  2. Add first step to workflow: Terminate Workflow with the condition “IsProcessing is 1”
  3. Add second step to workflow: Make Changes To Thing, and set IsProcessing to 1

This has completely removed the occurrence of race conditions in my workflow. While your specific circumstances are different, the above logic (something I learned in the 90s coding in C, haha, but whatever works) should work for you.

Note: if you will be executing the workflow on the same Thing multiple times, make sure to reset IsProcessing to 0 once your desired Actions are complete.
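For what it’s worth, here’s the same guard logic sketched in Python (hypothetical names; in Bubble these would be database reads/writes rather than attributes, and the lock stands in for the check-and-set happening as a single step — if the check and the set can interleave, the guard itself can still race):

```python
import threading

# Hypothetical in-memory stand-in for the Thing record.
class Thing:
    def __init__(self):
        self.is_processing = 0        # the added "IsProcessing" field
        self.lock = threading.Lock()  # stands in for the check-and-set
                                      # happening as one atomic step

def run_workflow(thing, work):
    with thing.lock:
        # Step 1: Terminate Workflow if IsProcessing is 1
        if thing.is_processing == 1:
            return False
        # Step 2: Make Changes to Thing, set IsProcessing to 1
        thing.is_processing = 1
    try:
        work()                        # the desired actions
    finally:
        thing.is_processing = 0       # reset so the Thing can run again
    return True
```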

Hopefully helpful!

Putting the API workflow on the list within a custom event should lead to it being completed before the next action.

I don’t think that will help. It’s just shifting the issue to the processing field. How do you determine when the WF on the list is complete?

You can of course run it recursively on the list, but that may not help with saving time.

I don’t think you can do this? I’m using the circled event type in the backend, which runs a backend workflow on every item in a list; there’s no way to do that with a custom event.

I can of course iterate over the list myself, only starting the next item in the list when the previous one is complete, but again I’d like to avoid this if at all possible because it will make the search much slower.

Share this workflow; it’s hard to guess what exactly is happening with your original search object.

Sure there is. Create a CE that takes the parameters necessary for the WF and within the CE run the API WF on a list.

Ahh yeah, the issue isn’t a race condition between what happens after the list is run and what happens during it; it’s that the workflows run during the list conflict with each other, so this won’t help much.

So how could there be a solution other than this?

Well, the workflow is quite complex and sharing it may confuse things, but I’ll try to give a simplified example here.

Frontend Workflow 1

  1. When “Search” is clicked, create a “Search” object.

  2. Return a list of search results from an API call.

  3. Call Backend Workflow 1 on the list, passing in each search result and the Search object.

Backend Workflow 1

  1. Create a “Search Result” object, which has the info about the search result from step 3 above.

  2. Add the “Search Result” object to a list field of the “Search” object.

What I’ve found is that, unless I set an interval of around 3 seconds in step 3, there’s a race condition and I don’t end up with all the “Search Result” objects attached to the “Search” object.
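To make the failure mode concrete, here’s a minimal Python sketch of the lost update (hypothetical stand-ins; the barrier just forces all the scheduled runs to overlap the way they do when the interval is too short):

```python
import threading

# Hypothetical stand-in for the Search object's list field. Each backend
# workflow run does a read-modify-write: fetch the current list, append
# its Search Result, write the whole list back.
search = {"results": []}
barrier = threading.Barrier(10)

def backend_workflow_1(result_id):
    current = list(search["results"])  # read the list field
    barrier.wait()                     # force all ten runs to overlap here
    current.append(result_id)          # add this run's Search Result
    search["results"] = current        # write back, clobbering the others

threads = [threading.Thread(target=backend_workflow_1, args=(i,))
           for i in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(search["results"]))          # prints 1 — nine results were lost
```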

There’s a lot of things I’d like to be able to do in parallel for this workflow! I’m doing multiple API calls for instance and having to wait is unacceptable from a usability perspective.

If the workflows are in direct conflict, you can split the data set so that they aren’t in conflict, OR you can use a NoSQL DB… but what else would you be looking to do?

What does this have to do with the other issue? No one’s stopping you from doing multiple API calls.

Why are you removing and adding the Embedding Vector search result in the same step?
Pretty sure your step 2 is the cause of the issue, since you are removing the :first item, which might be a valuable result, without a delay in scheduling.

It’s confusing, but he may be trying to do a FIFO and remove the first item (basically pushing all the others up).
@accounting4 if so, it should be one step that resets the list, not an add and a remove. However, you’re still going to have DB locks.

You have to rethink your DB schema.
Possible solutions include linking to the vector search instead of the other way around. Alternatively, if this works, use the CE and separate out the list update until after the API workflow on the list completes; there’s no reason to do it within each WF.
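A sketch of that inverted link, with hypothetical names: each Search Result stores a reference to its parent Search instead of the Search holding a list of results, so parallel creations write to separate records and nothing gets clobbered.

```python
# Hypothetical sketch of the inverted link: each Search Result carries its
# parent Search's id, instead of the Search holding a list of results.
# Parallel workflow runs each create their own record, so there is no
# shared list field to overwrite.
searches = {}        # id -> Search record
search_results = []  # each record references its Search

def create_search(search_id):
    searches[search_id] = {"id": search_id}

def create_search_result(parent_search_id, payload):
    # each workflow run creates its own record; no contested list update
    search_results.append({"search": parent_search_id, "payload": payload})

def results_for(search_id):
    # the Bubble-style equivalent would be a search for Search Results
    # whose Search = this Search
    return [r for r in search_results if r["search"] == search_id]
```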

Yes, I’ve mostly solved this, although I’m getting some other visual bugs now.

How I did it:

  1. When I call the original API call, I take the results and put each item I want from the result into its own list, which I call a new workflow on. This way I can iterate over the result of the API call myself.

  2. I iterate over the result myself by re-calling the same workflow and tracking the iteration I’m on. In this workflow, I only do the step of adding the result to the search list. On each iteration, I call another backend workflow that does the heavy lifting for each item, so I don’t need to wait for each one to be processed before calling the next.

  3. I call another backend workflow for each item that does the other steps, such as calling other APIs. This way I can very quickly iterate through adding the results to the search list.
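Roughly, that pattern looks like this in Python (hypothetical names; the recursion stands in for the workflow re-calling itself with the next index, and the threads for the scheduled heavy-lifting workflows):

```python
import threading

# Hypothetical sketch: the contested list update runs strictly one at a
# time (the workflow re-calls itself with the next index), while the slow
# per-item work is dispatched in parallel and never awaited.
search = {"results": []}
heavy_done = []
workers = []

def heavy_lifting(item):
    heavy_done.append(item)  # stands in for the extra API calls per item

def iterate(items, index=0):
    if index >= len(items):
        return
    # only the step that touches the shared list happens here, sequentially
    search["results"] = search["results"] + [items[index]]
    # heavy work is kicked off in parallel so the iteration never waits
    t = threading.Thread(target=heavy_lifting, args=(items[index],))
    t.start()
    workers.append(t)
    iterate(items, index + 1)  # "re-call the same workflow"

iterate(["a", "b", "c"])
for t in workers:
    t.join()
```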