Run mode import of data

Having trouble dealing with an API Connector call returning ~1,000 results. I need to create a new thing in the Bubble DB for every result returned, but I run into the "app too busy" error when I schedule the batch to run a workflow on a list. Small sets of data run fine. I considered posting to the /bulk Data API, but the inability to prevent duplicates stops me.

I’ve tried breaking it into batches, where the API Connector only retrieves 100 results and I then schedule a workflow to run on that smaller list in the future, but even when scheduling 5 or 10 minutes out, I get the "app too busy" error in the log after processing about 100.
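Bubble workflows are built visually, but the batching approach described above — fetch a page of 100, process it, then hand off to the next batch — is essentially a recursive-scheduling pattern (in Bubble, a backend workflow that schedules itself with the next cursor). A minimal sketch of that pattern, assuming a hypothetical paged data source (none of these function names are Bubble APIs):

```python
import time

# Fake external API: 1,000 records served in pages of 100.
RECORDS = [{"id": i, "name": f"item-{i}"} for i in range(1000)]

def fetch_page(cursor, page_size=100):
    """Stand-in for the API Connector call: return one page of results."""
    return RECORDS[cursor:cursor + page_size]

def process_batch(db, cursor=0, page_size=100, delay_seconds=0):
    """Recursive batch pattern: handle one page, then schedule the next.

    In Bubble this would be a backend workflow that reschedules itself
    (Schedule API Workflow) with the next cursor, rather than a loop
    over the whole list at once."""
    page = fetch_page(cursor, page_size)
    if not page:
        return db  # nothing left: the import is finished
    for record in page:
        db.setdefault(record["id"], record)  # create each thing once
    time.sleep(delay_seconds)  # spacing between batches keeps capacity usage down
    return process_batch(db, cursor + page_size, page_size, delay_seconds)

db = {}
process_batch(db)
print(len(db))  # 1000
```

The key point is that each invocation only touches one page, so no single run holds the server for long — the same reason small sets of data run fine.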

At this point, I am not interested in speed, just getting it done and in the background so users can move on to something else.

What am I missing here? What is the best practice for letting users pull external data into Bubble?

All ideas welcome!!!


+1 on this topic, I think you’ll find it’s the “workflow on a list”.

Workflow on a list is, for me, really unreliable in almost every aspect:

"app too busy" errors on a Professional plan with 2 boosts

things not assigning to other things correctly

workflows on lists just stopping for no reason after processing 300 items

conditions like "if Search for Thing:count = 0, then create item" still creating duplicates

I have been on this topic without any solution for about 6 months. I’ve tried everything, sought out help, and spent, no joke, upwards of 150 hours.

I just don’t think Bubble is capable of accurately processing a workflow on a list while checking and manipulating data within it.


Hey @chad Nice to know I’m not alone, but bummer you are ALSO spending so much time on this.

You’re correct, workflow on a list is just not quite there yet.

A couple thoughts that might help you:

re: dupes. The root cause is not being able to prevent duplicates at the data level, so especially when processing a list, we get race conditions. It happens all the time, and the only fix I know of is unique constraints on the table.
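To show why a unique constraint at the data level closes the race where a "count = 0 then create" check does not: a sketch using SQLite as a stand-in for the Bubble data type (Bubble does not expose this; the schema and field names here are illustrative):

```python
import sqlite3

# In-memory DB standing in for the Bubble data type; the UNIQUE
# constraint is the piece Bubble lacks at the database level.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE things (external_id TEXT UNIQUE, name TEXT)")

def create_if_new(external_id, name):
    """Insert-or-ignore: two racing callers can both run this,
    but the UNIQUE constraint guarantees only one row is created."""
    conn.execute(
        "INSERT OR IGNORE INTO things (external_id, name) VALUES (?, ?)",
        (external_id, name),
    )
    conn.commit()

# The equivalent of two list items both passing a 'Search:count = 0'
# check before either insert has landed:
create_if_new("abc", "first attempt")
create_if_new("abc", "racing duplicate attempt")

count = conn.execute("SELECT COUNT(*) FROM things").fetchone()[0]
print(count)  # 1 — the duplicate insert was silently ignored
```

With a search-then-create check, both callers can observe count = 0 before either write commits; with the constraint, the database itself rejects the second row, so no application-level ordering is needed.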

re: things not assigning correctly. For me, this happens when I create a thing earlier in a workflow (only if it’s not a dupe) and then assign it based on a search. For example, I create a thing in step 1 if it doesn’t exist, and then in step 6 I make changes to a thing found by searching for what was created (or for the dupe that prevented its creation).
What happens is the "make changes to a thing" fires too fast. I’m not positive, but I believe actions are optimized to fire off not sequentially but as they are ready, so sometimes I can avoid this by putting in a Terminate this Workflow action as a “mandatory wait”. Continuing the example, I insert a Terminate just before step 6, with a When condition of "Result of step 1’s unique id = -1". This way it waits for step 1’s result but never actually terminates. I haven’t figured out a more elegant way, but this works.
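The "actions fire as they are ready" behavior described above is the same hazard as launching concurrent tasks: the later step can run before the earlier write has landed, and the "mandatory wait" trick amounts to forcing step 6 to wait on step 1's result. A sketch of the two orderings with asyncio (an analogy, not Bubble's actual scheduler):

```python
import asyncio

async def create_thing(db, key):
    """Step 1: create the record (simulated slow write)."""
    await asyncio.sleep(0.05)
    db[key] = {"status": "created"}
    return key

async def change_thing(db, key):
    """Step 6: modify the record found by a search."""
    thing = db.get(key)  # 'Search for' the thing created in step 1
    if thing is None:
        return "missed"  # fired too early: the search found nothing
    thing["status"] = "changed"
    return "changed"

async def racing():
    db = {}
    # Both actions launched "as they are ready": step 6 can run
    # before step 1's write has landed.
    results = await asyncio.gather(create_thing(db, "x"), change_thing(db, "x"))
    return results[1]

async def sequential():
    db = {}
    # The "mandatory wait": step 6 only runs once step 1's result exists.
    await create_thing(db, "x")
    return await change_thing(db, "x")

print(asyncio.run(racing()))      # missed
print(asyncio.run(sequential()))  # changed
```

The Terminate-with-impossible-condition trick works because evaluating "Result of step 1" forces the workflow to wait for step 1 to finish, exactly like the explicit await in `sequential()`.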

re: app too busy. Sigh.

We have these amazing API connections that retrieve data, but then we can’t do much with it. I assume the good folks at Bubble would like to fix this as well. We need a way to spin through data and manipulate it.

I’m up for sponsorship @emmanuel @josh (actually for three things: unique constraints, workflow on a list improvements, and sort results by a calculated value in a repeating group.)


Ken, some great tips there, thanks very much for the input.
Looking it over, I wish I had better input for you in return.

I had a chat with one of the guys from CoBubble yesterday, and he said that basically the calls need to be spaced at least a couple of seconds apart.

I tried that last night with a list of 3,484 entries, where I basically have 4 workflows running to pull data and create categories, sub-categories, and the actual items.

Through a set of dummy workflows, and by breaking it down into smaller chunks by the items’ categories (4 minutes apart) and sub-categories (2 minutes apart), then running the workflow on the items (3 seconds apart),
I have managed to achieve all the key points I listed.
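The spacing scheme above — categories 4 minutes apart, sub-categories 2 minutes apart, items 3 seconds apart — is just staggered scheduling: each item gets a start-time offset instead of all firing at once. A small sketch of that bookkeeping (the intervals and names are from the description above, not Bubble settings):

```python
def schedule_staggered(items, spacing_seconds, start=0):
    """Return (start_time_offset, item) pairs spaced evenly apart,
    the way Schedule API Workflow can be given increasing delays."""
    return [(start + i * spacing_seconds, item) for i, item in enumerate(items)]

categories = schedule_staggered(["cat-a", "cat-b"], 240)        # 4 min apart
subcats = schedule_staggered(["sub-1", "sub-2", "sub-3"], 120)  # 2 min apart
items = schedule_staggered(list(range(5)), 3)                   # 3 s apart

print(categories)  # [(0, 'cat-a'), (240, 'cat-b')]
print(items[-1])   # (12, 4)
```

The total runtime grows linearly with the spacing, which is why the full import stretched to hours, but the app stays responsive because no burst of work ever lands at the same instant.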
However, it took 3-4 hours. That said, my app was mostly still able to work fine during the process, though I am on Professional with 2 extra units.
The only thing that didn’t work well was any other API calls and workflows.