Batch data processing, high workload unit (WU) usage and reporting

I am looking for some advice on how to process larger amounts of data inside Bubble without racking up heavy workload unit (WU) consumption.

I have an app that is used at cycling events to record rider times - it is popular and has been used to record over 30,000 results so far (https://resultsheet.co.uk/)

One part of the app keeps track of "points competitions". This involves taking results from the 30,000-row results table and preprocessing them into another set of database tables that drive the tables on Bubble pages. The process is a set of workflows that search the big 30k table, "group by" a number of fields, and then insert the grouped results into a destination table. That destination table is then used as the primary data source for the event series page; here is an example of the output: Event Series | East Sussex CA 2023 Demo Series
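To make the shape of the processing concrete, here is a minimal Python sketch of the aggregation step. The field names (rider, series, points) are placeholders rather than my actual schema:

```python
from collections import defaultdict

# Placeholder rows standing in for the 30k-row results table.
results = [
    {"rider": "A. Smith", "series": "Demo Series", "points": 10},
    {"rider": "A. Smith", "series": "Demo Series", "points": 8},
    {"rider": "B. Jones", "series": "Demo Series", "points": 12},
]

# Group by (rider, series) and sum points -- the equivalent of the
# "search, group by, insert into destination table" workflow in Bubble.
totals = defaultdict(int)
for row in results:
    totals[(row["rider"], row["series"])] += row["points"]

# Each entry becomes one row in the summary table that drives the
# event series page.
summary = [
    {"rider": rider, "series": series, "total_points": points}
    for (rider, series), points in totals.items()
]
print(summary)
```

In Bubble this same grouping runs as one workflow action per inserted row, which is where the WU goes.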

Building this process took ages because Bubble kept timing out, so I had to slow it down to the point that it now runs overnight. That is fine, but the workflow also uses a lot of WU, mainly from database actions like "insert into table". It is consuming around 20,000 WU per night, which, when WU pricing comes into force in October, is basically going to kill the app: it doesn't generate enough revenue to cover the cost.

So my question is: what is best practice for this type of scenario? My preference, of course, is to keep everything in Bubble. I can probably tweak the workflows a bit, but they will still be slow and will probably still consume too many WU, so I don't think that will move the needle enough.

Am I at the point where I need to consider an external database like Xano? If so, what does the process look like: push a copy of the Bubble data to Xano, have Xano do the processing, and then post the summarised data back so Bubble can use it again?
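For reference, here is a rough Python sketch of the round trip I have in mind (it could equally be a Xano function or any server-side job). The type names ("result", "series_summary") and the API token are placeholders, and it assumes Bubble's Data API is enabled on the app:

```python
import requests

# Bubble Data API root for the app -- assumes the Data API is switched on.
API_ROOT = "https://resultsheet.co.uk/api/1.1/obj"
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # placeholder token

# 1. Pull the raw results out of Bubble; the Data API pages through
#    records with a cursor.
rows, cursor = [], 0
while True:
    resp = requests.get(
        f"{API_ROOT}/result",  # "result" is a placeholder type name
        headers=HEADERS,
        params={"cursor": cursor},
    ).json()
    batch = resp["response"]["results"]
    rows.extend(batch)
    if resp["response"]["remaining"] == 0:
        break
    cursor += len(batch)

# 2. Do the group-by/summing externally (same logic as the sketch above),
#    producing one row per rider per series.
summary_rows = []  # ...aggregation of `rows` goes here...

# 3. Write the summarised rows back to the destination table so the
#    Bubble pages can keep reading from it unchanged.
for row in summary_rows:
    requests.post(f"{API_ROOT}/series_summary", headers=HEADERS, json=row)
```

The appeal of this shape is that Bubble only pays WU for the reads and the final writes, while the expensive grouping happens outside.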

Any thoughts on how to address this issue? Thanks to everyone in advance; I am open to all ideas :grinning: