Plan B - Backup your Database

I’ve created a service to back up your Bubble data.


With paying customers on your platform, the worst possible event is the loss or corruption of the data that you store on their behalf.

But can’t you roll back the database on Bubble?

You can roll back your entire database to a given timestamp. In the real world, though, you will most likely be looking to recover specific customer accounts or records - and for that, it’s just not feasible to regress all of your users’ data.

You might also find peace of mind knowing that you have an off-Bubble, fail-safe store of all of your records, should that ever be needed.

How does it work?

  • Once you’ve signed up, you can schedule daily backups of your Bubble tables.
  • You can download your data as CSV files for import back into Bubble.
  • Your backups are stored in AWS and available to you anytime.
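To make the "download as CSV for re-import" step concrete, here is a minimal sketch of parsing a downloaded backup file into records keyed by Bubble's unique id. The column names and sample data are illustrative assumptions, not the actual Plan B export format.

```python
import csv
import io

# Hypothetical sample of a downloaded backup CSV
# (column names are illustrative, not the real export schema).
backup_csv = """unique id,email,plan,Created Date
1685x1,alice@example.com,pro,2024-01-02
1685x2,bob@example.com,free,2024-01-03
"""

# Parse the CSV into dicts keyed by Bubble's unique id,
# ready for a selective restore or re-import.
rows = {r["unique id"]: r for r in csv.DictReader(io.StringIO(backup_csv))}

print(rows["1685x1"]["email"])  # -> alice@example.com
```

Keying on the unique id is what makes recovering individual records possible, rather than rolling back the whole table.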

As part of using Plan B, you will also get a detailed report of your database schema, alongside monitoring of significant changes to your database.

There is a free plugin to install and a registration key to enter into the plugin - start a free trial here

I’ve been doing trials with beta users for a few months now and so far have backed up nearly 4 million rows!



We’ve been beta testing Plan B for the last few months for automated daily backups of our critical tables.

In short, it’s fantastic and it’s never missed a beat.
If your software is B2B or enterprise-y then I’m not sure it’s viable to operate without something like this in place.

What we like about it is…
1) We can now say to customers that all data is backed up every 24 hrs and we don’t have to lift a finger.
2) When we go live with mistakes which impact customer data - which we sometimes do - we can now roll back only the affected customers / tables / areas. Not being able to do this is a terrible situation to end up in.
3) It feels simple and secure, and @lindsay_knowcode is always very responsive when we have questions.


Excellent developer. This is a great product!


In this case Plan B might just be the best option. :wink:


Love it!

It’s a game changer that gives you scalpel-like precision for working with your customers’ records, instead of taking a hammer to the whole lot.



  • Better pricing - with a “free forever” tier and more practical usage tiers
  • Beta database diagramming tools (based on Mermaid)
  • DBML export - coming shortly - I didn’t quite get it released over the weekend (football went to extra time :wink: )

This is great! Question: does this come with the ability to write SQL queries against this DB backup?

Plan B doesn’t right now, but it is possible and I’ve been toying with the concept. Conveniently, AWS lets you run SQL queries against CSV files stored in S3 - and the backups are all in S3 - so it is entirely possible to query your backup files.
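The same idea can be sketched locally: load a backup CSV into an in-memory SQLite table and run SQL over it. This is an illustration of SQL-querying CSV backups in general, not Plan B's actual implementation, and the sample data is invented.

```python
import csv
import io
import sqlite3

# An illustrative backup CSV (not the real export schema).
backup_csv = """unique id,customer,amount
a1,acme,120
a2,acme,80
a3,globex,300
"""

rows = list(csv.DictReader(io.StringIO(backup_csv)))

# Load the CSV rows into an in-memory SQLite table, then query with SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE backup (id TEXT, customer TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO backup VALUES (?, ?, ?)",
    [(r["unique id"], r["customer"], int(r["amount"])) for r in rows],
)

totals = conn.execute(
    "SELECT customer, SUM(amount) FROM backup GROUP BY customer ORDER BY customer"
).fetchall()
print(totals)  # -> [('acme', 200), ('globex', 300)]
```

On AWS itself, the equivalent would be running a SQL expression against the CSV object in S3 rather than downloading it first.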

It’s very cool. But I was wondering if anyone had a use case? What are you trying to do?

Like this

To be honest, the next thing I am working on for Plan B is productising support for tables with more than 50k rows. But this is interesting also :slight_smile:


Happy to chat about it, but I think the biggest opportunity/gap for folks in the Bubble world is being able to SQL-query our Bubble tables for BI purposes. It’s frankly astonishing how this isn’t possible yet. We write every important table to a MySQL database using the SQL DB Connector and then use Metabase to build dashboards, but it’s incredibly time consuming to set up and maintain.


No reason why you couldn’t SQL query your CSV Bubble backups - the daily export is already automated. It’s just a matter of exposing the SQL query side … I always thought a use case for Plan B was for data export and migrations - but as a BI data source :thinking:


Hi, this looks very promising!
When exporting, can I filter the exported data by one of the underlying data fields (e.g. created by)?
Asking because a customer of mine would like to have backups of his data

Do you mean filtering out just some rows within a table? Not today. But why not just back up the whole table - and every customer’s data (I am guessing)? How many rows are your tables?
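Filtering a per-customer slice out of a whole-table backup is straightforward after the fact. A minimal sketch, assuming the export includes Bubble's "Created By" field (the column names and sample rows are hypothetical):

```python
import csv
import io

# Illustrative backup CSV with Bubble's "Created By" field.
backup_csv = """unique id,Created By,notes
r1,user_17,alpha
r2,user_42,beta
r3,user_17,gamma
"""

def rows_created_by(csv_text, user_id):
    """Return only the backup rows whose 'Created By' matches user_id."""
    return [
        r for r in csv.DictReader(io.StringIO(csv_text))
        if r["Created By"] == user_id
    ]

mine = rows_created_by(backup_csv, "user_17")
print([r["unique id"] for r in mine])  # -> ['r1', 'r3']
```

So backing up the whole table and filtering on restore covers the per-customer case without needing filtered exports.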

How does backing up my data to PlanB affect my bubble WUs?


Nearly hit 100M rows now. :open_mouth:

About to release regional hosted backups - EU or US AWS buckets.


Great to know !!! Excellent idea !!!
That’s a major improvement in DB marketing…
Let’s give it a try !!!


Anyone have an answer how Plan B impacts your workload units?


@rhea One thing I know from looking at the schemas of Bubble apps is that they are noteworthy for their variety. Here are two example apps …

  • App 1: 49 tables, the largest with 20k rows - 135,000 rows in total - 2,979 WU
  • App 2: 9 tables - one of 47,000 rows, the other 8 tables under 100 rows each - 1,689 WU

So this is the daily WU cost to back up these two apps.
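From those two reported figures you can back out a rough per-row WU cost. This is just arithmetic on the numbers above - the row count for the second app's eight small tables is an upper-bound guess (100 rows each), since only "< 100 rows" was stated.

```python
# Rough per-row WU cost derived from the two reported example apps.
app1_rows, app1_wu = 135_000, 2979
# Second app: one 47,000-row table plus 8 tables assumed at <= 100 rows each.
app2_rows, app2_wu = 47_000 + 8 * 100, 1689

per_row_1 = app1_wu / app1_rows
per_row_2 = app2_wu / app2_rows
print(round(per_row_1, 4), round(per_row_2, 4))  # -> 0.0221 0.0353
```

So both apps land in the ballpark of a few hundredths of a WU per row backed up, though the exact formula Bubble applies to Data API calls isn't documented.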

When I look at the “What contributes to workload?” page in the Bubble Docs, and then at the WU totals in an app’s Log, I haven’t managed to figure out the formula that determines the WU usage for Data API calls.

One strategy to optimise your WU is to back up only those tables that need backing up daily. For static tables there is a button in the Plan B UI to “take an ad hoc backup now” - so just get Plan B to scan your schema, decide which tables you want to back up daily, and ad hoc backup the rest.

On the road map is having more frequency options (weekly, not just daily) to help with WU efficiency.


Nice topic!
I tried to use it but it didn’t work - what is the problem here, @lindsay_knowcode ?


Thank you for the reply, @lindsay_knowcode ! I’ll take a closer look at how these numbers apply to my app.


Thanks for reporting @amer - it was just down to the URL entered - the value should be just the home URL, without any pages.

Thanks for reporting it - I’ve added a lot more signposting and help to make it clearer for the next folks. :slight_smile:

I did a quick scan of your database schema - just two tables and an option set so looks like you’ve just started your Bubble journey - good luck! :+1:
