Way to query my bubble database with SQL?

I’d like to use SQL to query my bubble database directly.

Can I do this using the SQL database connector? I’d need DB credentials.

Any other ideas that don’t involve writing “Search for”s? I mainly need to do this to validate the “Search for”s.

Is this doable?

Updating my initial response… seconding mishav’s answer. There isn’t a plugin or equivalent way to SQL-query across tables using regular Bubble tools. The Data API GET will get you data from a single table. If you’re on dedicated, you can get read access to your Bubble database (logs not included). We’ve done this and use the data to feed our data warehouse.

Apart from the Data API and API workflow, there’s …

Any update on this topic?

No, still not doable. Bubble preserves its proprietary referential integrity by keeping users from directly querying the DB. Bubble aims to be low-code and zero-experience-needed, and SQL access to the backend DB isn’t on that menu.

A dedicated cluster, with a requested read-only DB connection, is the only available remedy, or… you can build your apps SQL-first from the start, using purely the SQL Connector for all data object definitions and CRUD (which is my recommendation).

Thanks for your answer. Unfortunate though, especially if you want to plug SaaS tools into the database…

What specifically were you hoping to do and could it be done by replicating the data onto a SQL database?

Well, I had a couple of things in mind:

Have you considered replicating your data to a SQL database? It might not give you all the capabilities you are hoping for, but it could be an alternative option that offers some of the functionality.

That’s something that could work for read-only replicas, yes. Do you recommend something (a plugin, good practice…) for keeping an external SQL database reasonably in sync with the Bubble database?

Honestly, the easiest solution is to use Bubble’s Data API.
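For anyone curious what that looks like in practice, here is a minimal sketch of pulling every row of one data type through the Data API. The app domain, data type name, and token below are hypothetical placeholders; the cursor/limit pagination and the `response.results` / `response.remaining` shape follow the Data API’s documented GET behavior.

```python
import json
import urllib.request


def page_url(base_url: str, data_type: str, cursor: int, limit: int = 100) -> str:
    """Build a paged Data API URL for one Bubble data type."""
    return f"{base_url}/api/1.1/obj/{data_type}?cursor={cursor}&limit={limit}"


def fetch_all(base_url: str, data_type: str, token: str) -> list[dict]:
    """Page through the Data API until 'remaining' reaches zero."""
    rows, cursor = [], 0
    while True:
        req = urllib.request.Request(
            page_url(base_url, data_type, cursor),
            headers={"Authorization": f"Bearer {token}"},
        )
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)["response"]
        rows.extend(body["results"])
        if body["remaining"] <= 0:
            break
        cursor += len(body["results"])
    return rows


# Example (hypothetical app domain and token):
# rows = fetch_all("https://myapp.bubbleapps.io", "monthlydata", "API_TOKEN")
```

You would then push `rows` into your external database on whatever schedule suits you; this is a full re-pull rather than an incremental sync, which is its main limitation at scale.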

Yes. I have a solution for keeping an external SQL database in sync. I will post the details here, but I truthfully haven’t been using it very long and I want to make sure all of the “kinks” are worked out before I post it on the forum.

I have been able to transfer 1,000+ records reliably in under 10 seconds (without any recurring workflows) to an external database and dynamically generate the table schema from the Bubble Data Types.


Sounds like a great solution, I can’t wait for your post !

Looking forward to hearing about it!

me too

me three

Alright alright, you finally wore me down. The truth is:

  1. Since I made this post last September, there has been a flurry of activity by Bubble developers using the same “clever” techniques I am using to achieve similar results (I’ll share a few examples with you below)
  2. I ran into some technical challenges trying to keep the data schema up to date, which I believe to be an important part of the full solution. However, enough time has passed that I feel it’s better to share what I have now than to further delay getting this information to interested persons
  3. I’m not sure when I’ll be able to make the time to solve the issues noted in #2. I have ideas for a solution, but it is a fairly substantial undertaking that I can’t commit the time to presently.

First, I want to share a few links of Bubble being used in clever ways, one of which will provide a part of the solution in a similar (and probably better) fashion to the way I am doing it.

(specifically the Floppy Hacker part)

And finally the one most relevant and useful for this post:

All of the plugins/solutions above use techniques that involve calling Bubble’s built-in (though not publicly documented) data functions to accomplish some or all of the task. In the case of the last plugin, that is useful for the purposes of this post because it generates the scripts needed to create the database schema in SQL. That is to say, before we can move data into an external database, we need the tables already created. The DBML plugin does just that for us.
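To illustrate what such generated DDL amounts to (this is a hand-rolled sketch with an assumed field-type mapping, not the DBML plugin’s actual output), here is one way to turn a description of a Bubble Data Type into a `CREATE TABLE` statement:

```python
# Sketch: derive a CREATE TABLE statement from a Bubble Data Type description.
# The field dict and the type mapping below are illustrative assumptions --
# the DBML plugin derives the real schema from your app's own metadata.

TYPE_MAP = {
    "text": "TEXT",
    "number": "NUMERIC",
    "date": "TIMESTAMPTZ",
    "yes/no": "BOOLEAN",
}


def create_table_sql(data_type: str, fields: dict[str, str]) -> str:
    # Bubble's unique id (_id) becomes the primary key of the replica table.
    cols = ["_id TEXT PRIMARY KEY"]
    cols += [f'"{name}" {TYPE_MAP[ftype]}' for name, ftype in fields.items()]
    return f'CREATE TABLE "{data_type}" (\n  ' + ",\n  ".join(cols) + "\n);"


print(create_table_sql("MonthlyData", {"month": "date", "total": "number"}))
```

Run this once per Data Type you plan to replicate so the target tables exist before any sync starts.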

Next, we need to create our mechanism for recording the changes that will at some point need to be added to or updated in our replicated database. To do that, I created a table called “TrackChanges”. This table holds a reference UUID (Data Field: _id) to any data that has been changed within our Bubble app.

Now that we have the Data Type created we can add the appropriate data. We will do that using a series of database triggers. NOTE: Database triggers can eat up your application capacity and lead to performance issues. This setup may not be appropriate for every app scenario. However, the methodology will still apply even if you used a scheduled workflow at a later point in time to add data to the TrackChanges table.

Here I’ll show you how you can use the database-trigger method, but as stated, this is not the only way to accomplish the task. The important part of these triggers is the “Only when” condition, which defines when each trigger will run. Pay close attention to those differences if you are not familiar with database triggers.

Trigger when a MonthlyData (Data Type) is updated:

Trigger when a MonthlyData (Data Type) is deleted:

Trigger when a MonthlyData (Data Type) is created:

Once you have this sequence created for one data type, you can simply copy/paste it for each of the different Data Types in your application that you want replicated. Notice that all of these triggers are simply creating a new data row in the TrackChanges table. When it’s working properly you should end up with data in your TrackChanges table that looks like this:

With these triggers properly set up, the TrackChanges table will have a row for every change made in your Bubble database. Note that we are not tracking which fields changed in an “update”, just that “something” in the row has changed.

Now all that is left for us to do is use the UUID stored in the TrackChanges table, together with the type of change that was made, to update our external SQL replica. This can be done using Bubble’s own SQL Connector and crafting the appropriate queries.
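Assuming each TrackChanges row carries the changed thing’s _id, its Data Type, and the kind of change (the exact field names are up to you), the replication step can be sketched as a small mapping from change type to parameterized SQL. The Postgres-style upsert and the single `payload` column here are illustrative; in practice you would interpolate the full column list from the table schema.

```python
def replication_sql(change_type: str, table: str) -> str:
    """Map a TrackChanges entry to a parameterized statement for the replica.

    'create' and 'update' both become an upsert keyed on Bubble's _id,
    so replaying the log out of order still converges; 'delete' removes
    the row. Column list is illustrative, not a complete schema.
    """
    if change_type in ("create", "update"):
        return (
            f'INSERT INTO "{table}" (_id, payload) VALUES (%s, %s) '
            "ON CONFLICT (_id) DO UPDATE SET payload = EXCLUDED.payload"
        )
    if change_type == "delete":
        return f'DELETE FROM "{table}" WHERE _id = %s'
    raise ValueError(f"unknown change type: {change_type}")
```

A sync pass then reads the pending TrackChanges rows, executes the matching statement for each via the SQL Connector (or any Postgres client), and deletes the processed rows.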

This post is already sufficiently lengthy so I’m going to pause here for questions/comments.


Thanks for your detailed answer. I am switching to Postgres all the way 🙂