Most Bubble apps break the moment you try to process large lists.
You hit:
• API limits (e.g. 100 items per request)
• Recursive workflows
• Backend scheduling chains
• Timeouts and fragile logic
We ran into this exact problem with a client:
• ~60,000 records
• An external API that allowed a max of 100 items per request
• A need for reliability, retries, and visibility
So we built Batchy Runner.
How it works:
Instead of looping workflows or scheduling backend jobs, Batchy Runner:
• Splits large lists into safe batches (e.g. 100 items)
• Sends requests sequentially from the browser
• Respects rate limits with configurable delays
• Retries failed batches automatically
• Exposes progress, current batch, and completion events
• Works entirely client-side
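The pattern behind those bullets is simple to sketch. This is not Batchy Runner's actual code — the names (`runBatches`, `sendBatch`, the option names) are illustrative — but it shows the same idea: chunk the list, send batches one at a time, pause between requests, retry on failure, and report progress.

```javascript
// Illustrative sketch of sequential client-side batching with retries.
// Names and options are hypothetical, not Batchy Runner's real API.

function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

async function runBatches(items, sendBatch, {
  batchSize = 100,        // e.g. the API's per-request limit
  delayMs = 250,          // pause between requests to respect rate limits
  maxRetries = 3,         // re-send a failed batch before giving up
  onProgress = () => {},  // hook for UI progress updates
} = {}) {
  const batches = chunk(items, batchSize);
  for (let i = 0; i < batches.length; i++) {
    let attempt = 0;
    for (;;) {
      try {
        await sendBatch(batches[i], i); // caller supplies the API call
        break;
      } catch (err) {
        if (++attempt > maxRetries) throw err; // exhausted retries
      }
    }
    onProgress({ done: i + 1, total: batches.length });
    if (delayMs && i < batches.length - 1) {
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

In a Bubble plugin element, the `onProgress` callback would be the natural place to publish progress states and fire the "current batch" and completion events to the rest of the app.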
Result
• 60,000 records processed cleanly
• No recursive workflows
• No backend scheduling
• No crashes
• Full progress tracking in the UI
What surprised us most
The solution wasn’t “more backend logic”.
In Bubble, performance at scale often comes from processing smarter before you overload workflows or the database.
If you’re dealing with:
• Large lists (10k–100k+)
• APIs with strict request limits
• Imports, syncs, or bulk operations
Batchy Runner changes what’s possible — without making your app fragile.