Database with too many entries

Alright… so I have a database that's been running for a few days now and it already has over 50,000 entries. I expect around 300,000 new entries every month. All my reports/dashboards will be generated from this database, but I'm a little worried that it will quickly get too big.

So I was thinking that maybe every month I could reset the database… maybe transfer the entries from previous months to another table that won't be accessed as often. The main issue here is that the "copy" functionality in Bubble only allows me to copy up to 100 items.

I'm trying to figure out the best way to work with large volumes of data in Bubble.

Any ideas?

Scheduling an API Workflow once a month to go through your current list and transfer the entries to another table might be a good way to go. You can transfer and delete all in one pass.
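Bubble's scheduled API Workflows are configured visually rather than in code, but the underlying pattern is a recursive batch move: take a chunk, copy it to the archive, delete it from the live table, and repeat until nothing is left. A rough sketch of that logic in Python, where `live` and `archive` are just hypothetical stand-ins for the two Bubble data types:

```python
def archive_batch(live, archive, batch_size=100):
    """Move up to batch_size of the oldest entries from live to archive.

    Returns the number of entries moved; call repeatedly (the way a
    Bubble API Workflow would reschedule itself) until it returns 0.
    """
    batch = live[:batch_size]   # take the oldest entries first
    archive.extend(batch)       # "transfer" step
    del live[:batch_size]       # "delete" step, in the same pass
    return len(batch)

live = [{"id": i} for i in range(250)]
archive = []
while archive_batch(live, archive):
    pass  # loop until the live table is empty
```

The 100-item batch size mirrors the copy limit mentioned above; in Bubble each iteration would be one scheduled workflow run rather than a loop.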


Thanks for your input… Quick question: how would you do the transfer using an API Workflow?

Usually for small loads I convert things into a list and then process the list… but I don't think that would work well with 300,000 items. What's your approach to transferring from one table to the other?

How instantaneous does the transfer need to be? You can schedule the API Workflow on a list and have it run every 5 seconds, which for 300 entries would take about 25 minutes.

Another solution would be to keep a "running" database and a full-list database. You make a running database that only keeps 300 entries; every time a new one is added, the oldest one is transferred to the full-list database. That way you don't have to do the transfer all at once.
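The rolling-table idea above can be sketched as a fixed-capacity buffer: each insert beyond the cap evicts the oldest entry into the archive, so no bulk monthly transfer is ever needed. This is a pattern illustration in Python, not Bubble's API; the class and field names are made up:

```python
class RollingStore:
    """Keep at most `capacity` recent entries; overflow spills to the archive."""

    def __init__(self, capacity=300):
        self.capacity = capacity
        self.running = []   # the small, frequently queried table
        self.archive = []   # the full-history table, rarely accessed

    def add(self, entry):
        self.running.append(entry)
        if len(self.running) > self.capacity:
            # evict the single oldest entry into the archive
            self.archive.append(self.running.pop(0))

store = RollingStore(capacity=300)
for i in range(1000):
    store.add({"id": i})
# running now holds only the newest 300 entries;
# everything older has been moved to the archive one item at a time
```

The appeal of this design is that the transfer cost is spread across every insert instead of concentrated in one large monthly job.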

@ryley.randall it doesn’t have to be instantaneous.

I'm aware of the approach you mentioned, but if every 300 entries takes 25 minutes as you said, then 300,000 would take 25,000 minutes. A month has approximately 43,200 minutes… so I would be running this process for about 17 days. It's not practical, since it is a very CPU-intensive job… my app would be at its limit for most of the month.
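For what it's worth, the back-of-envelope numbers in this thread check out. A quick calculation, assuming one entry is processed per 5-second tick as described above:

```python
SECONDS_PER_ENTRY = 5  # one scheduled run every 5 seconds

def transfer_minutes(entries, seconds_per_entry=SECONDS_PER_ENTRY):
    """Total wall-clock minutes to move `entries` items one at a time."""
    return entries * seconds_per_entry / 60

print(transfer_minutes(300))                   # 25.0 minutes
print(transfer_minutes(300_000))               # 25000.0 minutes
print(transfer_minutes(300_000) / (24 * 60))   # ~17.4 days of a ~30-day month
```

At that rate the monthly batch would consume well over half the month, which is why a batched workflow (moving 100 items per run) or the rolling-table approach scales better.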