Hoping to get some help with understanding how to use Bubble for a specific use case. The idea involves streaming live financial data and then comparing it to previous levels to give customers an edge (hopefully!).
I imagine I’d need to store the historic data in a database: Amazon S3? Google Cloud Storage?
Where would the live data be stored? The same place, pulled in via an API?
Is analysis done by running a formula in Bubble? Or would it have to be a SQL query?
I think I need help understanding the general principles from a Bubble standpoint - any help would be much appreciated!
Historical information: I would store this in the Bubble database, but given it’s going to be a lot of data, I’d have it indexed by Algolia so the performance is good.
You would need to use an API to bring in the live data, and then probably store whatever becomes useful as historical information in the database.
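To make that concrete, here's a rough Python sketch of the shape of that data flow: poll a market data API and push anything worth keeping into the Bubble database via the Data API. The feed URL, data type name, and field names are all placeholders for illustration; inside Bubble you'd normally wire this up with the API Connector and a backend workflow rather than external code.

```python
import requests

BUBBLE_DATA_API = "https://yourapp.bubbleapps.io/api/1.1/obj/pricepoint"  # hypothetical Bubble data type
BUBBLE_API_KEY = "YOUR_BUBBLE_API_KEY"
PRICE_FEED_URL = "https://example-market-data.com/v1/quotes/AAPL"  # placeholder market data API

def pull_and_store_quote():
    # 1. Pull the latest quote from the (hypothetical) market data API
    quote = requests.get(PRICE_FEED_URL, timeout=10).json()

    # 2. Keep only the fields that are actually useful as historical data
    record = {
        "symbol": quote["symbol"],
        "price": quote["price"],
        "timestamp": quote["timestamp"],
    }

    # 3. Save it as a new thing in the Bubble database via the Data API
    resp = requests.post(
        BUBBLE_DATA_API,
        json=record,
        headers={"Authorization": f"Bearer {BUBBLE_API_KEY}"},
        timeout=10,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    pull_and_store_quote()
```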
Bubble can do basic math, or you can use a third-party service like math.js to do more advanced calculations.
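For anything beyond Bubble's built-in operators, math.js also exposes a small web service you can call from the API Connector. A minimal sketch, assuming the public endpoint at api.mathjs.org (check its docs for the exact request format and limits):

```python
import requests

# Evaluate an expression with the math.js web service (assumed endpoint: api.mathjs.org/v4)
expr = "std(101.2, 99.8, 100.5, 102.1)"  # e.g. a volatility-style calc over recent prices
resp = requests.post("https://api.mathjs.org/v4/", json={"expr": expr}, timeout=10)
resp.raise_for_status()
print(resp.json()["result"])
```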
Having said that, Bubble isn't the best platform for live streaming data. I think it's fine if you're handling a lot of data and periodically pulling down fresh data. But if you want data coming in hot, being analysed, and then shown to users, I think you probably need to look down the route of a coded solution. In my experience, you generally can't keep a stream of live data coming through an API in Bubble; it'll just fail at some point. For example, I've tried this with the Twitter API.
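For reference, the "hot" path in a coded solution usually looks something like the sketch below: hold a persistent websocket connection open and process each tick as it arrives, which is the part Bubble can't really do for you. The feed URL and message format here are made up for illustration.

```python
import asyncio
import json

import websockets  # pip install websockets

FEED_URL = "wss://example-market-data.com/stream"  # placeholder websocket feed

async def consume_ticks():
    # Keep a persistent connection open and handle each tick as it arrives
    async with websockets.connect(FEED_URL) as ws:
        await ws.send(json.dumps({"subscribe": ["AAPL"]}))  # hypothetical subscribe message
        async for raw in ws:
            tick = json.loads(raw)
            # Compare against previous levels, alert users, persist what's useful, etc.
            print(tick["symbol"], tick["price"])

if __name__ == "__main__":
    asyncio.run(consume_ticks())
```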
Josh @ Support Dept
Helping no-code founders get unstuck fast, save hours, & ship faster with an expert on-demand
Not a problem. So it’s possible, not that hard to set up, but doesn’t work out of the box.
If you are on the personal plan you can schedule a workflow to run once a month, and if you're on the professional or production plans you can schedule it to run daily. Unfortunately there's no ability to set a shorter interval, e.g. "pull new data from this API every 60 minutes."
I haven't had to solve this issue myself, but I have seen others use a third-party service like Zapier to basically keep time and, each hour, ping Bubble to trigger the workflow that pulls data in from the API. This seems like a reasonable workaround, and Zapier is pretty good at doing stuff on a schedule like that.
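The same idea without Zapier is just a small script on any scheduler (cron, a serverless function, etc.) that calls a Bubble API workflow endpoint every hour. A rough sketch; the endpoint name and key are placeholders, and the workflow itself would need to be exposed through Bubble's Workflow API.

```python
import requests

# Hypothetical Bubble API workflow exposed as "pull-market-data" via the Workflow API
WORKFLOW_URL = "https://yourapp.bubbleapps.io/api/1.1/wf/pull-market-data"
BUBBLE_API_KEY = "YOUR_BUBBLE_API_KEY"

def trigger_hourly_pull():
    # Run this from cron (e.g. "0 * * * *") instead of a Zapier schedule
    resp = requests.post(
        WORKFLOW_URL,
        headers={"Authorization": f"Bearer {BUBBLE_API_KEY}"},
        timeout=30,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    trigger_hourly_pull()
```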
It's not the amount of data that's an issue, it's just Bubble putting some boundaries on people doing crazy things (I would think). 40k rows also isn't an issue, and I've worked with far larger databases. I think the key issue will be how quickly that's growing. 1k per day? 10k per day?
The more data you put in a database (whether Bubble or another), the more trouble you have with things like performance, because each time you search there are more rows of data to sift through. As I mentioned, there are ways you can manage this with services like Algolia, but the point is it's another thing to consider, configure, pay for, etc.
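If you do go the Algolia route, the work is basically keeping an index in sync with your Bubble data and searching against that index instead of scanning the database. A rough sketch using the v3-style Python client; the credentials, index name, and record shape are placeholders.

```python
from algoliasearch.search_client import SearchClient  # pip install algoliasearch

# Placeholder credentials and index name
client = SearchClient.create("YOUR_APP_ID", "YOUR_ADMIN_API_KEY")
index = client.init_index("price_history")

# Push records (e.g. mirrored from the Bubble database) into the index
index.save_objects(
    [
        {"objectID": "AAPL-2024-01-02", "symbol": "AAPL", "close": 185.64},
        {"objectID": "AAPL-2024-01-03", "symbol": "AAPL", "close": 184.25},
    ]
)

# Fast lookups hit Algolia rather than sifting through every row yourself
results = index.search("AAPL")
print(results["hits"])
```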
My advice would be to be selective about the data you choose to store so it's readily available for your app, vs. pulling data down on the fly from the API.
API costs aside, if it were me, I would store only what I need to make the app feel responsive and provide a good experience for the things people do regularly. For data that's accessed less often, the best approach might just be to show a loading screen for 2-3 seconds while you pull that data down from the API.
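In practice that decision can be as simple as: check whether the data is already in your own database, and only hit the external API (behind a loading state) when it isn't. A hedged sketch; the data type, field names, and external endpoint are placeholders, so check Bubble's Data API docs for the exact constraint format.

```python
import json
import requests

BUBBLE_DATA_API = "https://yourapp.bubbleapps.io/api/1.1/obj/pricepoint"  # hypothetical data type
BUBBLE_API_KEY = "YOUR_BUBBLE_API_KEY"
EXTERNAL_API = "https://example-market-data.com/v1/history"  # placeholder external source

def get_history(symbol: str) -> list:
    # 1. Check what's already stored in your own database (fast, no external API cost)
    constraints = json.dumps(
        [{"key": "symbol", "constraint_type": "equals", "value": symbol}]
    )
    stored = requests.get(
        BUBBLE_DATA_API,
        params={"constraints": constraints},
        headers={"Authorization": f"Bearer {BUBBLE_API_KEY}"},
        timeout=10,
    ).json()
    results = stored.get("response", {}).get("results", [])
    if results:
        return results

    # 2. Otherwise fall back to the external API; show a 2-3 second loading state meanwhile
    return requests.get(EXTERNAL_API, params={"symbol": symbol}, timeout=10).json()
```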
Josh @ Support Dept
Helping no-code founders get unstuck fast, save hours, & ship faster with an expert on-demand