
Is there a limit on how many records can be processed at once?

I’ve got a database of 1,300 people and I’m trying to add a new datatype to them. In one click it’s supposed to create and attach this datatype to a user. It only seems to do 200 and then stop. I have to refresh the page and click the button again. Am I missing something?

Can you show your workflow to clarify how you’re doing what you’re describing?

Sure. I have a workflow to fix a missing record attached to the user. The workflow runs and activates a backend workflow (which finds all users with the missing record).

It then creates a looped backend workflow that cycles through each person from the list above. Step 1 is to create the missing datatype.

Step 2 is to assign it to the user.

Step 3 is to repeat the backend workflow until it has gone through all the records.
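Bubble workflows aren’t written as code, but if it helps to see the shape of what’s happening, here’s a minimal sketch of that recursive pattern in Python. Every function name here is a hypothetical stand-in for a Bubble action (“Create a new thing”, “Make changes to a thing”, “Schedule API Workflow”), not anything from Bubble’s actual API:

```python
# Hypothetical stand-ins for the Bubble actions described above.

def create_missing_record(user: str) -> dict:
    """Step 1 stand-in: create the missing datatype for one user."""
    return {"owner": user}

def attach_to_user(user: str, record: dict) -> None:
    """Step 2 stand-in: assign the new record to the user."""
    record["attached"] = True

def fix_missing_records(users: list[str]) -> None:
    """One pass of the backend workflow: handle the first user,
    then re-schedule itself on the remainder (Step 3)."""
    if not users:                             # exit condition: list exhausted
        return
    current, remaining = users[0], users[1:]
    record = create_missing_record(current)   # Step 1
    attach_to_user(current, record)           # Step 2
    fix_missing_records(remaining)            # Step 3: recurse on the rest

# Kicked off once by the button click. (Amusingly, Python has its own
# recursion ceiling, ~1000 frames by default, so this demo stays small.)
fix_missing_records([f"user{i}" for i in range(300)])
```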

For what it’s worth, I’ve noticed a similar thing happening with two other backend workflows in two different apps: stable up to ~150 items or so, but around 200 they just stop processing without any error, even with pauses between recursions. No server usage spikes. I’ve ended up working around it by running the BE WF multiple times. I’ve not read of any “200” limit anywhere. These are Personal plan accounts, and adding 3 server units for an hour doesn’t make any difference.
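In code terms, that workaround is just chunking the list below the stall point and kicking the workflow off once per chunk. A hypothetical sketch (`run_backend_workflow` stands in for one full run of the recursion described above):

```python
def run_backend_workflow(chunk: list[str]) -> None:
    """Stand-in for one complete run of the recursive backend workflow."""
    for user in chunk:
        print("fixed", user)

def chunked(items: list[str], size: int):
    """Yield consecutive slices of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

users = [f"user{i}" for i in range(1300)]

# Stay under the ~150-200 item range where the recursion silently stalls.
for batch in chunked(users, 150):
    run_backend_workflow(batch)
```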

@datproto

It’s up to your app’s capacity units.

If you have fewer units, it can process fewer records; if you have more units, it can process more records. But it all depends on the records and the capacity units.

If you have fewer columns (say 5) in the datatype, you can iterate up to around 500 records. If you have more columns and the datatype holds more data, then you can iterate around 250 per action.

Interesting @mani2726 :thinking: - how did you discover the columns-to-records limit? Is it a documented limit or something?

Curious where you’re getting that info? Unfortunately, processing even a couple hundred single-datapoint records seems incredibly unstable, often missing a couple of creations or timing out even with capacity never going over 20%.

Hi @lindsay_knowcode & @chris.williamson1996 ,

You can try creating a simple data type with fewer columns and running some workflow actions on it in bulk.
Example:
Create a table called “Sample 1” with columns c1, c2 & c3, add up to 1,000 records, and run some actions on them via an iterating workflow.

Then you can add another data type with more columns and more data per record, and run the same bulk workflow action.
Example:
Create a table called “Sample 2” with columns c1, c2, c3, … c20, add up to 1,000 records, and run the same actions via an iterating workflow.

You can see the difference between the two.
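If you want to script that setup instead of creating the records by hand, here’s a rough sketch that seeds both sample tables through Bubble’s Data API bulk-create endpoint. This assumes the Data API is enabled for those types; the app name, version path, and token are placeholders, and the size printout is just a crude way to see the per-record weight difference:

```python
import json
import requests  # third-party: pip install requests

APP = "yourapp"            # placeholder app name
TOKEN = "YOUR_API_TOKEN"   # placeholder Data API token
BASE = f"https://{APP}.bubbleapps.io/version-test/api/1.1/obj"

def seed(typename: str, rows: list[dict]) -> None:
    """Bulk-create rows via the Data API /bulk endpoint
    (newline-delimited JSON, text/plain body)."""
    body = "\n".join(json.dumps(row) for row in rows)
    resp = requests.post(
        f"{BASE}/{typename}/bulk",
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "text/plain"},
        data=body,
    )
    resp.raise_for_status()

# “Sample 1”: 3 light columns.
narrow = [{"c1": i, "c2": f"v{i}", "c3": True} for i in range(1000)]

# “Sample 2”: 20 columns with more data per record.
wide = [{f"c{c}": f"value {c} of row {i}" for c in range(1, 21)}
        for i in range(1000)]

# Crude per-record “weight” comparison.
print("narrow record bytes:", len(json.dumps(narrow[0])))
print("wide record bytes:  ", len(json.dumps(wide[0])))

seed("sample1", narrow)
seed("sample2", wide)
# Now run the same iterating workflow over each table and compare
# how far each one gets before it stalls.
```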

I got this info by doing some research while generating reports from the database tables.

Note: it’s all about your records’ weight/size.


Thanks for pinging. I haven’t read about the 200 limit anywhere else either, and I had a similar workflow that worked back then (at least I think it did).

I would also propose backend workflows, but seemingly they’re not working here.

Last idea (for now at least): do a loop in Bubble and iterate through the entries one at a time, using backend workflows as well, but limiting the rate of change to one record per loop. You could add some breathing room by scheduling the looped backend workflow not at Current Date/Time but at Current Date/Time +0.1, to ensure smooth running. I’ve experienced this to be beneficial sometimes when manipulating large amounts of data.
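As a rough illustration of that throttle (hypothetical stand-ins again, not Bubble’s API), each pass handles exactly one entry and waits 0.1 s before the next, which is what scheduling at Current Date/Time +0.1 buys you in the workflow editor:

```python
import time

def process_one(entry: str) -> None:
    """Stand-in for the per-entry work (create and attach the record)."""
    print("processed", entry)

def throttled_loop(entries: list[str], delay_s: float = 0.1) -> None:
    """Handle one entry per pass, then pause before the next, mimicking
    'Schedule API Workflow' at Current Date/Time +0.1 seconds."""
    for entry in entries:      # one record per loop iteration
        process_one(entry)
        time.sleep(delay_s)    # the +0.1 s breathing room between passes

throttled_loop([f"user{i}" for i in range(10)])
```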