[Feature Enhancement] Schedule 100k Workflows with Schedule API Workflow on a List

@Sarah_Esteve can you clarify this? By “not respected” do you mean that the interval has a default / baseline interval of more than 5 seconds even if we can set it to less than 5?

Yeah, this issue was already present even before the recent performance update. As a workaround, when I encounter this use case I usually restructure my DB by adding Thing B as a field on the Thing A data type. That helps a lot to remove possible race conditions.

Maybe you can do this instead?


I’ve considered this, but it would mean using Search fors to display these records - as we have thousands of Thing Bs in the database, this wouldn’t be great in terms of performance and WU consumption. I’m going to go with @aj11’s approach, but I was hoping not to have to go down that route.

When you run the Schedule API Workflow on a list for your Thing As, will each Thing A have a different Thing B? If not, then you wouldn’t need to search for Thing Bs - you can just pass it as a parameter.

@ntabs oh you mean in the API workflow itself - yeah that’s what I’m already doing

I mean all workflows run in parallel if the interval is less than 5 seconds. I already reported this a few months ago @ambroisedlg - try running with a 2-second interval and you’ll see that everything runs with a zero-second interval :rofl:

For a while now I’ve known from support that you cannot write to the same list field from parallel workflows. Best is to trigger a new workflow at the end (on the last item created) and do a search to write to the list field.
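Why parallel writes to one list field lose items can be sketched outside Bubble. A minimal Python simulation, assuming (as support's explanation implies) that each workflow run does a read-modify-write of the whole list field - this is a hypothetical model, not Bubble's actual internals:

```python
import threading
import time

# Simulates each scheduled workflow doing a read-modify-write on the same
# list field: read the current list, append one item, write the list back.
field = []  # the shared "list field" on the parent Thing
N = 100     # number of parallel workflow runs

def workflow(i):
    global field
    snapshot = list(field)   # read the field
    snapshot.append(i)       # append this run's item to the local copy
    time.sleep(0.001)        # other runs execute in this window
    field = snapshot         # write back, clobbering concurrent writes

threads = [threading.Thread(target=workflow, args=(i,)) for i in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# The field usually ends up with far fewer than N items: every run that
# wrote between another run's read and write had its item silently lost.
print(len(field))
```

Triggering a single follow-up workflow on the last item avoids this, because only one writer ever touches the list field.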


Hello, I have problems too.
It works in testing mode (10 users) but not in live (1k users; checking the DB every 20 seconds but nothing changes).
If anyone can help me, thank you.

Thanks Bubble team and @steven.harrington … these are really important scalability and reliability enhancements.

Not sure if this is on the roadmap or compatible with Bubble’s workflow-based architecture, but some database operations I’ve performed in other relational DBs are “one command” steps rather than 100,000 actions.

For example:

```sql
DELETE FROM [table] WHERE [condition];
UPDATE [table] SET Column1 = True WHERE [condition];
TRUNCATE TABLE [tablename];
```

Is there any plan to consider ways of interacting with the Bubble database like this?

I do have a concern about a 100k-row operation costing 12,000 workload units when it is something the underlying database technology supports in a single command - it’s purely a Bubble architecture issue. I haven’t looked closely, but I believe other no-code database solutions support these kinds of operations.


This is absolutely on our radar. We recognize splitting up simple data updates into individual operations is not the most efficient way to accomplish them. However, the workflow-based improvements have allowed us to improve support for a broader set of use cases, so we made the tradeoff to start there.

We are now actively exploring and planning new ways to enable bulk data updates specifically, which will likely be more in line with the method you are describing. More to come later this year!


Hi Sarah,

The interval you set is respected by Bubble, even as low as zero (see docs).

There is no inherent problem with scheduling workflows that contain many actions. It may take longer for all of them to be completed, but the scheduling and execution will support this without issue.


Hi Eli,

Yes, Bubble has logic in place for Dedicated instances to throttle the execution of the workflows if needed to ensure they run successfully and your app remains functional. The scheduling is controlled by you, as you set the start time and interval. However, in cases where your app is throttling the execution, the workflows may run somewhat later than scheduled.


@steven.harrington I encountered this problem and reported it to Bubble support, so yes, it does happen that the interval is not respected.

I think Bubble just hasn’t taken the time to set up the code structure to handle this for us correctly. If step 2 refers to an object from step 1, then step 2 should not run until step 1 has completed. And for a simple add of a single object to a list, there should be no race condition that causes a problem - logically, why would adding value A at the same time as adding value B to the same list cause value A not to be added properly? It, like you said, just doesn’t make sense.

@aj11 was this on a recursive backend workflow or did you do this on a schedule backend workflow on a list? I’m curious if there is a value available in a dynamic expression on the backend workflow to set the condition to ‘this is the last item’ when we use the schedule backend workflow on a list.


Workflow on a list.

Okay, so is that something like this below?

On backend workflow have two of the parameters be of the same type.

[screenshot]

Then on any action within that backend workflow that I want to run when the list has finished processing I use the condition like below

And when scheduling the backend workflow on the list set the last-item in a similar way as below.

[screenshot]

I believe this is what you are talking about, and I think it’s pretty nifty. I had always assumed there was no way to know when the list has finished processing when scheduling backend workflows on a list, and this is a novel way to do it.

Next step now is to check how many WUs are consumed just to evaluate the condition, and to try to find a way to reduce it if possible. I think that on each loop the backend workflow runs through (i.e. however many items are in the list), the ‘last item’ parameter is going to be fetched from the DB, racking up WU consumption (image of Cookie Monster eating a bunch of cookies just came to mind). I’m going to check out sending just the unique ID of the last item and using the condition ‘item unique id is last item’, where last item is just a text parameter and not of the same data type.

It would be interesting to see if there is any WU consumption difference if I set it up like below.

[screenshot]

[screenshot]

I’ve tried it the way you had it above, and for some reason it sometimes fails.

I always make the “last item” parameter a yes/no, and set it to “list:last item is this item” when scheduling.

Then trigger your action based on the last item field being “yes”. That has worked well for me every time.
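The steps above can be sketched as a tiny Python model. All function and variable names here are hypothetical - in Bubble the flag is set in the editor as “list:last item is this item”, not in code:

```python
# Models "Schedule API Workflow on a list" with a yes/no "last item" parameter.
processed, finalized = [], []

def schedule_on_list(items):
    """At scheduling time, pair each item with a flag: is it the list's last item?"""
    last = items[-1]
    return [(item, item is last) for item in items]

def backend_workflow(item, is_last):
    processed.append(item)      # the per-item work
    if is_last:                 # condition: "only when last item is yes"
        finalized.append(True)  # e.g. trigger the follow-up workflow here

# Each scheduled run carries its item plus the precomputed flag.
for item, is_last in schedule_on_list(["a", "b", "c"]):
    backend_workflow(item, is_last)

print(processed, finalized)  # ['a', 'b', 'c'] [True]
```

Note the caveat raised later in the thread: the flag marks which run carries the last item, not which run finishes last, so when runs execute in parallel the follow-up may fire while a few runs are still in flight.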


That will also reduce the WU consumption, in addition to your use case of ensuring it doesn’t fail.

Thanks for the details. Sounds like a must-use approach with schedule on a list, especially now that race conditions will creep in much more often with the new performance enhancement - plus it doesn’t seem as if Bubble is going to mitigate the race-condition issue for us, if that could be done at all.


I think it’s fine the way it is. The mitigation path is there just by putting an interval in for 2,3,5 seconds etc.

Keep in mind, even with the “last item” approach, there is no guarantee that the last item actually runs last if Bubble is trying to run all at once. All it really does is guarantee you can trigger a single sequential workflow around the time that the list should be complete.

It’s one of those things where we all built around an inefficiency, and now have to adjust. Removing the inefficiency was still the right move by Bubble regardless IMO…

I think adding a “create items on a list” action is the next most impactful move Bubble can make in this realm of features. It would negate the need for workflow on a list in many cases and allow the work to be done inline instead of asynchronously.
