API workflow: Making changes to 2M rows in database table

I may need to make changes to 2M rows/things in a database table.

Using API workflow.

Is this feasible?

What is the maximum number of rows I should do at a time, should I decide to break it up?


For starters, you’ll want to use a recursive workflow. This means that it tackles one change at a time. It’ll take longer, but at least you won’t have to worry about maxing out your capacity and the rest of the changes failing.

Depending on what the change is, you’ll probably want to schedule the next API workflow (the last step in your recursive workflow) to be at least 5-10 seconds after the current date/time.

That way it gives the workflow enough time to complete the first one without scheduling a new one.
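Bubble workflows are built visually, but the recursive pattern described above can be sketched in code. This is only an illustration of the logic, not Bubble's actual API: the function name, batch size, and delay are placeholders.

```python
from datetime import datetime, timedelta

def schedule_recursive_step(remaining, batch_size=1, delay_seconds=5):
    """Illustrative sketch of the recursive-workflow pattern:
    process a small slice of the work, then schedule the next
    run a few seconds in the future so the current step can
    finish before the next one starts."""
    if not remaining:
        return None  # queue empty: stop rescheduling
    batch, rest = remaining[:batch_size], remaining[batch_size:]
    for item in batch:
        pass  # the "Make changes to a thing..." step would happen here
    # In Bubble this is "Schedule API Workflow" at Current date/time + delay
    next_run = datetime.now() + timedelta(seconds=delay_seconds)
    return next_run, rest
```

Each invocation hands the shrunken `rest` list to the next scheduled run, so capacity is never saturated by one giant operation.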

My recommendation would be to copy your live data to dev and try on your dev version. Once you’re happy it works well, then you can deploy to live and run it on live.


Utilizing the Data API bulk edit/create will be quicker than a recursive workflow.


How do you do that?


Even if I set it to 5 seconds, it would take more than 115 days to complete!
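For reference, the quick maths behind that 115-day figure:

```python
rows = 2_000_000
seconds_per_row = 5                     # one item per recursive run, 5 s apart
total_seconds = rows * seconds_per_row  # 10,000,000 seconds
days = total_seconds / 86_400           # 86,400 seconds in a day
print(round(days, 1))                   # ~115.7 days
```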

Copying into my development DB is a good idea, but I’m not sure if copying 2M rows will crash the server too :sweat_smile:


Thanks! Bulk edit is definitely faster, but there's no visibility into the progress; might it crash the server or hang midway?

Very possible. You'd need a queue system, probably a last-updated date field, and to do them in bulks of 100, 300, or 500. The max it allows is 1,000 per bulk, but that's where the queue comes in, doing smaller batches for stability.
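The queue idea above boils down to splitting the full record list into fixed-size batches. A minimal sketch of that chunking (the batch size of 500 is just one of the suggested values):

```python
def chunk(items, size=500):
    """Split a work queue into batches no larger than `size`,
    staying well under the 1,000-records-per-bulk cap."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

batches = list(chunk(list(range(2_000_000)), size=500))
print(len(batches))  # 4000 batches of 500
```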

Bubble's database is tremendously underpowered, even on dedicated plans, for handling large volumes of records. I've been testing for weeks just to find a scalable way to bulk-create groups of 10k records, and there are pros and cons to each approach.

Personally, if you're dealing with that scale repeatedly, take it to Google Cloud or AWS if possible.

Honestly, Bubble just needs to fix the stability and scalability of bulk adjustments/creation/deletion in the database; looking through the forums, it's been a problem for multiple users for a couple of years.


Thanks so much, will try it out!

Hmmm, yes that’s problematic - should have run some quick maths on that one!

@chris.williamson1996 when you are building a recursive workflow, would you recommend using Make changes to a thing... and limiting the search for only the first 100 or 300 or 500 items?

Or would you use Make changes to a list of things... and edit 100 or 300 or 500 items simultaneously?

Where is the limit of 1000 per bulk? I am able to bulk edit 13k records (I didn’t tap Run Workflow so maybe this limit surfaces later?)

Thanks so much for your help

Hey Greg, actually when I said max per bulk I was referring to the Data API /bulk option.

Which actually got an amazing new update to run 8X faster!

That flow you're running can do 13k items.

Basically it just pushes the full list through the API workflows on the back end one by one; it's pretty speedy depending on your plan.
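For anyone curious what the Data API bulk-create call looks like: per Bubble's Data API docs, the `/bulk` endpoint takes a plain-text body with one JSON object per line. This is a sketch only; the app URL, type name, and fields below are placeholders.

```python
import json

# Placeholder records; in practice these come from your own data.
records = [{"name": f"Item {i}"} for i in range(3)]

# One JSON object per line, newline-separated (not a JSON array).
payload = "\n".join(json.dumps(r) for r in records)

# This payload would be POSTed with headers like:
#   Authorization: Bearer <api_token>
#   Content-Type: text/plain
url = "https://yourapp.bubbleapps.io/api/1.1/obj/item/bulk"

print(payload)
```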


Hey @chris.williamson1996, is there a way to bulk update using Data APIs like you’ve suggested? I am unable to see it in the documentation Data API requests - Bubble Docs

I see bulk creation, or single data point update. But how to do bulk update?

@mghatiya Not sure about a "Bulk Modify" endpoint yet; I put it on the ideaboard a while back. I let Bubble know and they acknowledged it would be a good idea.

Not bulk edit, only create. Edits will have to go through a recursive workflow you create.

Okay, thanks @tylerboodman and @chris.williamson1996 for confirmation.

Has anyone tested how fast the bulk create is?

I’d love to know how long it takes for instance to bulk create 10k things

On the legacy Personal plan I was seeing something like 1,000 things in 50 seconds. New plans could possibly be faster. Either way, it's infinitely faster than a recursive WF creating each one.
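Extrapolating from that observed rate (and assuming it holds linearly, which is not guaranteed), 10k records would take roughly eight minutes on that legacy plan:

```python
things = 1_000
seconds = 50
rate = things / seconds              # ~20 things per second observed

ten_k_seconds = 10_000 / rate        # ~500 seconds
print(round(ten_k_seconds / 60, 1))  # ~8.3 minutes
```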

It used to be hours, now it’s down to a few min. Firebase can do it in under a second though for 10k things.


That’s right! :wink: