Creating 2 million things a year

Hello Bubblers,

I am planning to create a platform where professionals can view and find real-time alerts. I automatically retrieve these alerts using an SDR stick from various locations and then feed them into a database. I’ve got this part working.

However, we’re talking about approximately 5,000 alert messages per day in real-time. That’s about 2 million messages annually, which need to be searchable, support counts, etc.

Of course, it also needs to be affordable in terms of workload units (WU). What’s the best approach here? Should I handle everything within Bubble? Or keep the data external? But everything must remain real-time, of course.

I hope someone has a good tip or can share their thoughts on this.

Keep it in Bubble unless there’s another good reason not to. The size of each thing is more important for WU than the number of things (aside from the cost of actually creating them in the first place).


Use the /bulk API for this; it’s 1000x faster than backend workflows.

Curious about this, Chris. I would have thought a backend workflow would be the method here. What’s the thinking behind the bulk API? (I’ve never used this action, so just curious for learning.)


The Bulk API is just far, far faster than a recursive workflow.

I’ve not had to create thousands of records at once for a while, so this was from about 1.5 years ago, and there have been optimizations to backend workflows since, but I’m assuming it’s still not nearly as fast as bulk.

When I had to create 10k records at once back then, the recursive workflow took between 4 and 10 hours. Doing it in bulk took under 10 minutes. (Firebase could do it in seconds.)

So drastic differences. In my case there was other data that needed to be populated, so I populated what I could in the /bulk flow, then ran an edit flow to hit the API needed to populate the rest. This allowed all records to be created quickly and displayed on the front end while the API filled in the remaining info.
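For anyone who hasn’t used it: Bubble’s Data API exposes bulk creation as a single POST of newline-delimited JSON. Here’s a minimal sketch in Python — the app URL, the `alert` data type, and the API key are placeholders you’d swap for your own, and you should double-check the exact endpoint shape against your app’s API settings:

```python
import json
import urllib.request

# Placeholders -- substitute your own app domain, data type, and API key.
BULK_URL = "https://yourapp.bubbleapps.io/api/1.1/obj/alert/bulk"
API_KEY = "YOUR_API_KEY"


def build_bulk_body(records):
    """The bulk endpoint takes one JSON object per line (not a JSON array)."""
    return "\n".join(json.dumps(r) for r in records)


def bulk_create(records):
    """POST newline-delimited JSON to the Data API's /bulk endpoint.

    Returns the response body, which contains one status line per record.
    """
    req = urllib.request.Request(
        BULK_URL,
        data=build_bulk_body(records).encode("utf-8"),
        headers={
            "Content-Type": "text/plain",  # bulk expects plain text, not application/json
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8").splitlines()
```

One request creates the whole batch, which is why it’s so much faster than a recursive workflow creating one thing per iteration.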

This may not be the best method WU-wise, but it will keep things quick if timing is a concern.


Thanks, but the data arrives one by one. Not all data comes at once; it’s a real-time stream of decoded data from a transmission tower that I convert into JSON and forward to Bubble.

Just sending that data already consumes 1.1 WU, so I was hoping there might be a better option for that. For now, we’ll keep everything in Bubble.
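One option worth considering when a stream arrives one message at a time: buffer alerts on the sending side and flush them to Bubble in small batches, so you pay the per-request overhead once per batch instead of once per message. A minimal sketch, assuming a few seconds of latency is acceptable for “real-time” display — `send_bulk` is a placeholder for whatever actually posts the newline-delimited payload to your app:

```python
import json
import time


class AlertBatcher:
    """Collect decoded alerts and flush them in batches rather than one request each."""

    def __init__(self, send_bulk, max_size=50, max_age=5.0):
        self.send_bulk = send_bulk  # callable that posts a newline-delimited JSON string
        self.max_size = max_size    # flush after this many alerts...
        self.max_age = max_age      # ...or once the oldest buffered alert is this old (seconds)
        self.buffer = []
        self.first_at = None

    def add(self, alert):
        """Buffer one alert dict; flush automatically when a threshold is hit."""
        if not self.buffer:
            self.first_at = time.monotonic()
        self.buffer.append(alert)
        if (len(self.buffer) >= self.max_size
                or time.monotonic() - self.first_at >= self.max_age):
            self.flush()

    def flush(self):
        """Send everything buffered so far as one payload and reset."""
        if self.buffer:
            payload = "\n".join(json.dumps(a) for a in self.buffer)
            self.send_bulk(payload)
            self.buffer = []
```

With ~5,000 alerts/day, batches of 50 would mean roughly 100 requests a day instead of 5,000, trading a few seconds of delay for a large cut in request overhead.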

Woah, just read your link, had no idea about this, that’s a powerful tool in the toolbox…

Thanks for sharing!