RE: Stuck on this: How to access thousands of DB records and put into JSON?

Perhaps I’m misunderstanding, but sending dynamically generated JSON to the Postmark API seems ideally suited to a recursive workflow, which could operate in the background on a data set of any size. There would be no need to retrieve the entire data set client-side, and it would be much easier on capacity.
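For the curious, here’s the shape of that pattern translated out of Bubble into a rough Python sketch: send one chunk of messages to Postmark’s batch endpoint, then re-invoke on the remainder until the list is empty. The record field names and sender address are placeholders, not anything from a real app:

```python
import requests

POSTMARK_URL = "https://api.postmarkapp.com/email/batch"
CHUNK = 500  # Postmark caps a batch send at 500 messages per call

def send_in_chunks(records, token):
    """Send one chunk, then recurse on the rest -- the same structure as a
    Bubble recursive workflow that re-schedules itself with the remaining list."""
    if not records:
        return  # base case: nothing left, the "workflow" stops
    chunk, rest = records[:CHUNK], records[CHUNK:]
    payload = [
        {
            "From": "sender@example.com",  # placeholder sender
            "To": r["email"],              # assumed field names on each record
            "Subject": r["subject"],
            "TextBody": r["body"],
        }
        for r in chunk
    ]
    resp = requests.post(
        POSTMARK_URL,
        json=payload,
        headers={"X-Postmark-Server-Token": token, "Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    send_in_chunks(rest, token)  # "schedule the next iteration"
```

Because each call handles a bounded chunk, no single step ever comes close to the timeout, which is the whole point of the recursive structure.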

I’ve already built a batch processor in pure Bubble that can iterate over a list of things of arbitrary size and complexity. I currently use it to dump thousands of records, each drawing on a number of related tables, to a CSV file. At its core, it’s based on recursive workflows.
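In non-Bubble terms, the CSV dump boils down to a paged loop like the sketch below; `fetch_page` is a hypothetical stand-in for the workflow’s paginated search:

```python
import csv

PAGE = 200  # records fetched per iteration; tune to stay well under the timeout

def export_csv(fetch_page, path):
    """Write records to CSV one page at a time, stopping when a fetch
    comes back empty -- the iterative equivalent of the recursive workflow."""
    with open(path, "w", newline="") as f:
        writer = None
        offset = 0
        while True:
            rows = fetch_page(offset, PAGE)  # hypothetical paginated fetch
            if not rows:
                break  # base case: no more records
            if writer is None:
                writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
                writer.writeheader()
            writer.writerows(rows)
            offset += len(rows)
```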

The primary impetus for creating it was that this other approach falls flat on its face due to the hard-coded 30-second timeout.

I also use the batch processor to automate a nightly “data flattening” process, which transforms and caches complex data structures into a “flatter” format in a dedicated table, making them better suited to “interactive” visualizations (charts and graphs).
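The flattening step itself is just a projection from nested records into one row per record, along these lines (the field names are made up for illustration):

```python
def flatten(order):
    """Collapse one nested record (related tables already looked up)
    into a single flat row for the cache table."""
    return {
        "order_id": order["id"],
        "customer_name": order["customer"]["name"],  # pulled from a related table
        "item_count": len(order["items"]),
        "total": sum(i["price"] * i["qty"] for i in order["items"]),
    }
```

Charts and graphs then read straight from the flat table instead of re-walking the joins on every render.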

BTW, in your case, you should be able to display the total count client-side while doing the retrieval and data processing in the background.

I’ll give my batch processor a try with the Postmark API when I get a chance and post back with the results.

-Steve
