The Problem:
I have an external API that returns employee records, but it’s limited to 100 records per request. I need to fetch and process approximately 60,000 records in total. The API uses a cursor parameter for pagination.
What I Need:
A recursive workflow in Bubble that:
Fetches records in batches of 100 using the cursor parameter.
Processes each batch (e.g., saves records to the database).
Automatically fetches the next batch by incrementing the cursor until all records are retrieved.
Key Details:
API Endpoint: GET /employees
Parameters:
limit: Fixed at 100 (max allowed by API)
cursor: Integer offset (starts at 0, increments by 100)
Goal: Process all ~60,000 records without hitting the pagination limit
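Bubble workflows are built visually, but the loop described above is language-agnostic. A minimal sketch in Python, assuming a hypothetical `fetch_page` callable standing in for the `GET /employees?limit=100&cursor=N` call:

```python
BATCH_SIZE = 100  # the API's fixed `limit`

def fetch_all(fetch_page):
    """Fetch every record by walking the integer cursor in steps of 100."""
    cursor = 0
    records = []
    while True:
        batch = fetch_page(cursor)    # one GET /employees call at this cursor
        records.extend(batch)         # process the batch (e.g. save to the DB)
        if len(batch) < BATCH_SIZE:   # a short batch means this was the last page
            break
        cursor += BATCH_SIZE          # increment the cursor by 100
    return records
```

With ~60,000 records this loop runs about 600 times, one API call per iteration.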
You will need to use backend workflows along with the API Connector plugin to loop through and fetch all the data. Alternatively, if it is public data without an API key and you only want to display it on the front end, you can use the API call directly as a data source and paginate on the front end, without saving anything to the database.
Can you explain what your goal is with this data?
Or explain specifically which part you are getting stuck with?
Let us know. Thanks.
Most of the time we don't need to see all 60,000 records on the page at once; that's why I'm asking what the goal is.
Goal: I need to process and save all ~60,000 employee records from an external API into my app’s database. This is a backend data migration/initial sync task, not for displaying records on a page.
Specific Sticking Point: I understand I need a recursive backend workflow with the API Connector, but I’m stuck on implementing the loop logic itself. I can fetch the first 100 records, but I can’t figure out how to:
Correctly check if the returned batch has 100 items (meaning there might be more).
Structure the workflow to schedule itself again with an updated cursor parameter.
Ensure it cleanly stops after the final batch (when fewer than 100 records are returned).
I have the API call and the “create record” workflow set up. The challenge is purely the recursive pagination structure in the backend workflow to overcome the 100-record limit. An example or screenshot of a working recursive loop in a backend workflow would be incredibly helpful.
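The three steps you're stuck on (check batch size, reschedule with an updated cursor, stop cleanly) map to a self-scheduling pattern. A hedged sketch in Python, where `fetch_page` and `save_records` are illustrative stand-ins for your API Connector call and your "create record" workflow, and the recursive call plays the role of Bubble's "Schedule API Workflow" step pointed at the workflow itself:

```python
BATCH_SIZE = 100  # the API's fixed `limit`

def process_batch(cursor, fetch_page, save_records):
    """One run of the backend workflow: fetch, save, maybe reschedule."""
    batch = fetch_page(cursor)
    save_records(batch)                 # your existing "create record" step
    if len(batch) == BATCH_SIZE:
        # A full batch means there may be more records: schedule this same
        # workflow again with the cursor incremented by 100.
        process_batch(cursor + BATCH_SIZE, fetch_page, save_records)
    # Fewer than BATCH_SIZE records: final batch, so simply don't reschedule.
```

The stop condition is implicit: the workflow only reschedules itself when it received a full batch, so a short (or empty) final batch ends the chain.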
We are not displaying 60,000 records at once. The goal is to have the employee data in the Bubble database so that we can use it in a repeating group, and when a user clicks a button in a cell, we can pass that specific employee’s data to another page.
If the data only exists as an API response and is never stored in the database, we cannot pass an employee's data to another page, because it is not a Thing in Bubble's database and therefore cannot be sent as page data.
Therefore, we need to store the 60,000 records in the database so that:
We can display them in a repeating group (with pagination, so not all at once).
When a user clicks a button in a cell, we can pass the entire employee record (Thing) to the next page.
Pass the ID, and on that page make the API call for that one record via its ID. I'm sure the API provider has an endpoint for fetching a single record. There's no reason to save 60,000 records in the database.
Use a backend WF that's triggered recursively: make the API call, save the batch to your DB, then check if there's a next cursor. If yes, trigger the same WF again with the new cursor. Just be careful about execution limits, and maybe add a delay between calls so you don't get rate-limited.
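Adding the rate-limit guard, the pattern might look like this sketch (Python for illustration; in Bubble the delay would be set via the "Scheduled date" of the re-scheduled workflow, and `DELAY_SECONDS` is an assumed tuning value, not an API constant):

```python
import time

BATCH_SIZE = 100    # the API's fixed `limit`
DELAY_SECONDS = 0   # set > 0 in production to space out calls (assumption)

def run_workflow(cursor, fetch_page, save_batch):
    """Fetch one batch, save it, and reschedule with a delay if more remain."""
    batch = fetch_page(cursor)
    save_batch(batch)
    if len(batch) == BATCH_SIZE:          # full batch: there may be a next page
        time.sleep(DELAY_SECONDS)         # breathing room to avoid rate limits
        run_workflow(cursor + BATCH_SIZE, fetch_page, save_batch)
```

At 100 records per call, ~60,000 records means roughly 600 chained runs, so even a small per-call delay adds up; pick it to match the API's documented rate limit.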