I have read the forums over the last few days regarding this topic and can’t seem to find a definitive answer.
Basically, I have a list of things called “Interval_reports”. I am pulling data from GPS tracking units using an API and storing it in the Bubble database; there is no media, only four text/number fields. The problem is that I am now sitting with over 250k entries, and no matter what I do I cannot seem to bulk wipe/delete them (paid plan).
I have tried using a workflow (Delete a list of things on button click); this just eventually times out the app and doesn’t delete a single record. I am not sure how to configure a backend workflow to achieve this, or if that will even work. Besides this table, my database is tiny.
I have enjoyed Bubble for many years, and this truly has been the only major frustration I have had. I don’t understand why it is so difficult to delete thousands of entries using a simple function. Am I missing something, or does anyone have a solution for this? I’m sure I am not the only user who deals with huge numbers of entries that require deletion (especially when testing).
Try deleting the list of things in small chunks: delete one chunk, check for remaining old records, and repeat with the next chunk.
Deleting a lot of records in a single workflow won’t work, but splitting the process into chunks does; I use this procedure for removing large numbers of records.
Place a button and call Schedule an API workflow on click. In the API endpoint, add a Delete a list of things action and provide the desired data source. Then just click that button once from the frontend; it will delete all the records in the backend, and you don’t need to stay on the page waiting for the workflow to complete.
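If you’d rather do the cleanup from outside the editor entirely, Bubble’s Data API can also delete records one by one from a script. This is a minimal sketch, assuming the Data API is enabled for your data type; the app name `yourapp`, the type name `interval_report`, and the API token are placeholders you would replace with your own:

```python
import json
import urllib.request

API_TOKEN = "YOUR_API_TOKEN"  # placeholder; generate one under Settings > API
BASE = "https://yourapp.bubbleapps.io/api/1.1/obj"  # placeholder app name

def obj_url(base, type_name, unique_id=None):
    """Build the Data API URL for a data type, optionally for one record."""
    url = f"{base}/{type_name}"
    return f"{url}/{unique_id}" if unique_id else url

def _call(url, method="GET"):
    """Issue an authenticated request; parse JSON only for GETs."""
    req = urllib.request.Request(
        url, method=method,
        headers={"Authorization": f"Bearer {API_TOKEN}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp) if method == "GET" else None

def delete_all(type_name, page_size=100):
    """Fetch records a page at a time and DELETE them one by one."""
    while True:
        data = _call(obj_url(BASE, type_name) + f"?limit={page_size}")
        results = data["response"]["results"]
        if not results:
            break  # nothing left to delete
        for rec in results:
            _call(obj_url(BASE, type_name, rec["_id"]), method="DELETE")

# delete_all("interval_report")  # uncomment to run against your app
```

Note this issues one HTTP DELETE per record, so it is slow for 250k rows, but it runs entirely outside your app and won’t tie up a page or a workflow.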
How would I go about doing this? I have placed a button and added Schedule an API workflow; however, there are no workflows listed. This is the backend workflow I would like triggered on button click. I’ve probably not done this correctly; I don’t use backend workflows in Bubble often.
Note: I have an option set (Heavy, Medium, Light) with numeric values I use to tune the loop speed. This is a simple loop, so the delay is 0.4 seconds.
I can get this loop to delete about 4.2k records per hour at around a 40% server load with the delay above.
I typically set a “for delete” flag to mark the records so they don’t show (which is instant to the user), then run the delete at night so it doesn’t overload the system (you just schedule the first workflow for that evening and the loop takes care of the rest).
Best of luck!
You need to loop like this rather than work on a list, because the number of variables you are passing (the IDs of the records) is too big to handle any other way.
This type of loop may become fairly expensive to run in terms of WUs under the new pricing model, so keep an eye on it. It’s a Bubble limitation that we have to do it this way.
Also, if you are on the new pricing model, you may not have to worry about setting a delay; it should all just handle the process, and you won’t get timeouts or overloads any more.
I think I did it. Create a backend workflow and have it delete your things up until item #50. Then reschedule the same workflow (Current date/time + 4 or 5 seconds). Repeat until no records are left.
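The control flow of that recursive workflow can be sketched in plain Python. This is only a simulation of the scheduling logic, not Bubble itself; the chunk size of 50 matches the “until item #50” constraint above, and each pass of the `while` loop stands in for one scheduled run of the workflow:

```python
from collections import deque

CHUNK = 50  # "until item #50" in the workflow's data source

def run_recursive_delete(records, chunk=CHUNK):
    """Simulate the recursive backend workflow: each scheduled run deletes
    up to `chunk` records, then reschedules itself if any remain."""
    queue = deque(records)
    runs = 0
    while queue:  # "Repeat until no records are left"
        for _ in range(min(chunk, len(queue))):
            queue.popleft()  # stands in for the 'Delete a thing' action
        runs += 1  # one scheduled workflow run completed
    return runs

# 230 fake records in chunks of 50 take 5 scheduled runs of the workflow
print(run_recursive_delete(range(230)))  # → 5
```

With a 4–5 second gap between runs, deleting N records takes roughly `N / 50` reschedules, which is why large tables still take a while even though no single run times out.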
Just delete the development data if you want to completely erase a table, and then restore that to live. Helpful in case you need to erase a specific table for re-upload.
Hi all. Yesterday I had to delete 200k+ rows and found a method that far surpassed the others, if you’re OK losing the created_date and modified_date of the data you want to keep. What I tried first, and why I chose another solution:
Deleting via the editor → crashes if you try too many rows at once; too time consuming
Workflow to delete list of things → too WU intensive
Recursive workflow → didn’t try, but in theory it should have the same WU cost
What ended up working:
I was trying to delete the rows of sample_table, so I
created a new data type in Bubble, e.g., sample_table_new.
Then I replicated all fields and their types, with the exact same field names.
Then, I used the search tool to search for “uses field” and tracked down any element/workflow that referenced the old table, updating those expressions to reference the new data type. It didn’t take long, and the best part is that Bubble automatically applies all the field references from the old table to the new table, as long as they have the exact same name and data type.
Create new data view with the data you want to keep.
Export data from this data view
Upload data into the new table. This is where the main limitation comes in – you will lose created_date and modified_date from your original table.
Update all privacy rules on the new data type to reflect those of the old data type.
Delete the old data type.
Rename new data type to the same as the old one’s name, in case you want Data API connections to continue to work.
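The export/filter/re-upload step in the middle of that list is just row filtering on a CSV. A minimal sketch with Python’s stdlib `csv` module, assuming a hypothetical export with a `status` field (your own field names and keep-condition will differ):

```python
import csv
import io

def filter_export(csv_text, keep):
    """Read a Bubble CSV export and keep only the rows for which
    keep(row) is True, returning CSV text ready to upload into the
    new data type."""
    reader = csv.DictReader(io.StringIO(csv_text))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        if keep(row):
            writer.writerow(row)
    return out.getvalue()

# Hypothetical export with a 'status' field; keep only the active rows.
exported = "unit_id,status\nA1,active\nB2,archived\nC3,active\n"
print(filter_export(exported, lambda r: r["status"] == "active"))
```

Remember the caveat from the steps above: created_date and modified_date are set at upload time, so the original timestamps do not survive this round trip.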
Hope this helps. It was by far the most reasonable solution for me.