DB migration to Bubble

I’m considering migrating an app to Bubble. The existing database is on SQL Server, and its logical structure will not change with the migration. The question is: should I import everything into Bubble’s database or keep the existing DB on a separate SQL Server? Besides a possible lack of optimization of Bubble DB operations, my concern also extends to development itself - is writing expressions the same regardless of the source of things, that is, Bubble’s internal database vs. an external database? And, of course, there’s the 200-record limitation for select clauses, which means doing the majority of operations step by step.

Besides this, in the existing DB the IDs are GUIDs, which is different from Bubble’s IDs. How do I cope with that, either way?

After some research and testing, it seems that the best way is to import all the data into Bubble. The ease of “pure Bubble” development cannot be compared with writing separate SQL statements to perform searches etc. against the outside database. The only trouble is that I have to prepare all the procedures to do a bulk API import as a batch.
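For anyone scripting a similar batch import: here’s a minimal sketch of how the bulk step could look in Python. The app URL, data type name, and API token are placeholders, and the batch size is just a conservative guess at the per-call limit - check the Data API reference for your plan. Bubble’s bulk endpoint takes newline-delimited JSON:

```python
import json
import urllib.request

API_ROOT = "https://yourapp.bubbleapps.io/api/1.1/obj"  # placeholder app URL
API_TOKEN = "YOUR_API_TOKEN"                            # placeholder token
BATCH_SIZE = 100  # assumed per-call limit; verify against the Data API docs

def make_batches(rows, size=BATCH_SIZE):
    """Split a list of row dicts into batches small enough for one bulk call."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

def bulk_create(data_type, batch):
    """POST one batch as newline-delimited JSON to /obj/<type>/bulk.
    Returns the response lines (one status line per submitted row, in order)."""
    body = "\n".join(json.dumps(row) for row in batch).encode("utf-8")
    req = urllib.request.Request(
        f"{API_ROOT}/{data_type}/bulk",
        data=body,
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "text/plain",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8").splitlines()

# usage (not run here):
#   for batch in make_batches(rows):
#       bulk_create("customer", batch)
```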

And the other trouble is translating GUIDs into Bubble IDs, since I have more than 30 data types/tables, which are related in complex ways. Translating means creating a new ID for each thing and writing its value into all related tables. One option is to do the translation with workflows inside Bubble; the other is to prepare the translation outside. My tests show that if you import the Bubble IDs (which Bubble created during the bulk API import) as “foreign keys” in related things, this works and Bubble finds the connections nicely, so there’s no hidden information, just the IDs.
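To illustrate the “prepare the translation outside” option: the bulk endpoint answers with one status line per submitted row, in the same order, so you can pair each source GUID with the Bubble unique ID it received and then rewrite the foreign-key columns of the related tables before importing them. A rough sketch (the response-line format is taken from my tests - verify it on your app; field names are made up):

```python
import json

def build_id_map(guids, bulk_response_lines):
    """Pair each source GUID with the Bubble unique ID returned for it.
    Each response line is assumed to look like {"status":"success","id":"..."}."""
    id_map = {}
    for guid, line in zip(guids, bulk_response_lines):
        result = json.loads(line)
        if result.get("status") == "success":
            id_map[guid] = result["id"]
    return id_map

def remap_foreign_keys(rows, fk_fields, id_map):
    """Replace GUID foreign keys with Bubble IDs in the rows of a related table.
    Unknown GUIDs are left untouched so you can spot them afterwards."""
    remapped = []
    for row in rows:
        new_row = dict(row)
        for field in fk_fields:
            if new_row.get(field) in id_map:
                new_row[field] = id_map[new_row[field]]
        remapped.append(new_row)
    return remapped
```

The order of imports then matters: parents first, so their Bubble IDs exist before the child tables that reference them are remapped and imported.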

My question is - can Bubble-like IDs be created outside Bubble, and are there any rules regarding their consistency? Anyhow, I will do some tests with that, but maybe some Bubblers have experience with such migrations.


Hi @tomazz.turk

Thanks for your valuable post.

I have around 20,000 rows in a bigger table and a few small tables. Our data needs a lot of manual updates and joins while updating, so having it in a proper database is handy for us.

I appreciate your input not to try to run SQL queries from Bubble to get the data. But then how do we refresh our Bubble database whenever we update the 20,000 records? Bubble seems to have a 200-record limit on bulk imports?

(I changed my username 🙂)

Hm, it’s really hard to say because I don’t know your system, the reasons you chose Bubble, or the reasons for “a lot of manual updates and joins while updating”, as you mentioned. Anyway, you can use the SQL Connector, but you have to prepare SQL queries for each of Bubble’s DB accesses (create, read, update, delete). That way, you will lose some of the best parts of Bubble development, like expressions, DB searches, optimized queries behind the scenes, etc. But you can still develop a nice user interface quickly, so it can be done. When developing a GUI, the limitation of 200 records is reasonable, but the SQL queries should be very specific, returning exactly the lists of things that you show directly on screen.

My decision above was easy, since I don’t have a distributed application, everything can be in Bubble and the app architecture is simple.

PS: the limit is 100 records: https://bubble.io/reference#Plugins.dbconnector

Thanks @eftomi

I was saying we will indeed use the Bubble database. The manual updates are like a feed we need to update by hand, as it needs manual verification. So we are going to do all our work in MySQL outside of Bubble and plan to refresh (delete our Bubble database and load it again with fresh updated data) from our MySQL database once a week. Think of it like adding selective stock exchange transactions of the previous week, some 20,000 records, every Monday morning.

The 200 (or 100) record limit on bulk loads into Bubble is stopping us from doing this, and I wonder if there is a workaround for once-a-week bulk loads into Bubble? The only alternative seems to be loading into Excel (from MySQL) and then loading that into Bubble, but from our testing this also has performance bottlenecks.

I see. In this case you can use the ‘manual’ process of importing the data in CSV format (reference); before that, you can use Bulk Action to delete the existing records.

If this will be done on a regular basis, you can use an ETL or similar tool. I’m using Talend Open Studio for Data Integration to take the data from MS SQL and send it to Bubble’s Data API (reference) via REST API calls. The deletion of data can also be done this way.
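If you’d rather script the weekly refresh than use an ETL tool, the same Data API can drive the delete-then-reload cycle. A rough outline, assuming the paginated GET response shape from the Data API reference (`results`, `remaining`, `_id`) - app URL and token are placeholders:

```python
import json
import urllib.request

API_ROOT = "https://yourapp.bubbleapps.io/api/1.1/obj"  # placeholder app URL
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}    # placeholder token

def page_url(data_type, cursor, limit):
    """Build the paginated list URL for GET /obj/<type>."""
    return f"{API_ROOT}/{data_type}?cursor={cursor}&limit={limit}"

def list_ids(data_type, page_size=100):
    """Page through GET /obj/<type> and collect the unique IDs of all things."""
    ids, cursor = [], 0
    while True:
        req = urllib.request.Request(page_url(data_type, cursor, page_size),
                                     headers=HEADERS)
        with urllib.request.urlopen(req) as resp:
            payload = json.loads(resp.read())["response"]
        ids += [thing["_id"] for thing in payload["results"]]
        if payload["remaining"] == 0:
            break
        cursor += page_size
    return ids

def delete_thing(data_type, thing_id):
    """DELETE /obj/<type>/<id> removes one thing."""
    req = urllib.request.Request(f"{API_ROOT}/{data_type}/{thing_id}",
                                 headers=HEADERS, method="DELETE")
    urllib.request.urlopen(req).close()

# weekly refresh (not run here):
#   for thing_id in list_ids("transaction"):
#       delete_thing("transaction", thing_id)
#   ...then bulk-create the fresh 20,000 rows in batches via /bulk
```

For 20,000 rows this means a few hundred API calls each way, so it’s worth running it off-hours and checking your plan’s rate limits.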