Database Speeds

Hi, I am pumped about Bubble and its community. I am just about to hire a Bubble designer to build our site, but I have a few questions about database speeds. Any help you can give here would be appreciated.

I have a table of 10M records with 2 fields each.

Q1. Is it going to be hard to import that much data into Bubble’s db?

Q2. What are the speeds going to be like when searching the db within Bubble? It will be an exact-match search, e.g. select * from table where domain = 'apple.com'

Q3. Should I consider putting this table into some other hosted MongoDB/MySQL solution and then connecting to it from Bubble?

Q4. What would you do if you were me and you wanted it to be fast? We have some cash to spend.

Thanks for any help you can give. I am pumped to get started.


The actual download 'action' takes 30 seconds for a maximum of 100 records. In the editor, 10,000 records took me around 6 or 8 hours, even though the file was only 100 KB. Do the math. I wish they would improve that part eventually. For now, a good solution is to keep that kind of data outside Bubble on a separate server.
Maybe you can convince people to sponsor that kind of 'speed option', or ask Bubble to offer a solution at a price.
Let me know if you find another way.
JohnM

Wow. It took you 6 to 8 hrs to upload a 10K list to their database? If I upload 1M records, that would be 600 hrs. Count me out! :slight_smile:

If so, an external option is going to be way better. Does anyone have experience with the best way to do this?


There's also the option of coming in through an API Workflow: you set it up to receive data there, at let's say 10-50 requests per second (I don't know the exact rate yet), and that is probably the faster way. Or ask the Bubble team to import your file?
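For what it's worth, here is a minimal sketch of what feeding rows into such an API Workflow endpoint might look like from a small local script (Python with the requests library; the app name, endpoint name, field names and API token below are all hypothetical placeholders, and the actual URL pattern and rate limits should be checked against Bubble's API docs):

```python
# Minimal sketch: POST CSV rows one by one to a Bubble API Workflow endpoint.
# The URL, token and field names below are hypothetical placeholders.
import csv
import time

import requests

URL = "https://yourapp.bubbleapps.io/api/1.1/wf/import_row"  # hypothetical endpoint
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}          # hypothetical token

with open("data.csv", newline="") as f:
    for row in csv.DictReader(f):  # assumes columns named "domain" and "value"
        resp = requests.post(
            URL,
            json={"domain": row["domain"], "value": row["value"]},
            headers=HEADERS,
            timeout=30,
        )
        resp.raise_for_status()
        time.sleep(0.05)  # crude throttle, roughly 20 requests per second
```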


Thank you. Have you ever connected to a database outside of Bubble? What is the best way to do that?

We've used the "run an API on a list" approach to load data into our database. It took 1-2 seconds per row. So, not very fast.

A faster way is to create the table first. Then, create a blank row for each record, add an "item number" field, and set it so that each row has an incremental number (e.g., 1, 2, 3). Then, load the data through the API using "Make changes to a list", set it to load the list, and have it pull "item number = current row's item number" for each row. This is 5-10x faster based on our time trials for 100 rows (and it's probably a lot faster at large scale - maybe 100x faster, but that's just a guess). We're also talking with Bubble about ways to make this a lot faster still.
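If it helps, here is a minimal sketch (plain Python, run locally; the file names and the "item number" column name are just placeholders) of preparing the source CSV with an incremental item number, so each row can later be matched to the blank Bubble row carrying the same number:

```python
# Minimal sketch: add an incremental "item number" column to a CSV so each
# source row can be matched against a pre-created blank row in Bubble.
# File names and the column name are placeholders.
import csv

with open("data.csv", newline="") as src, \
     open("data_numbered.csv", "w", newline="") as dst:
    reader = csv.reader(src)
    writer = csv.writer(dst)

    header = next(reader)                 # assumes the file has a header row
    writer.writerow(["item number"] + header)

    for i, row in enumerate(reader, start=1):
        writer.writerow([i] + row)        # 1, 2, 3, ... matches the blank rows
```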

Bubble's also open to someone sponsoring them to create a quick-load capability that's similar to their current export, which is quite fast. It might be cheaper to pay them for this feature than to build something yourself. I'd be happy to chip in $100 for this, and other community members might as well. So, something to look into with Bubble for sure. If you think you might go this route, PM me and I can help get some community support to cover some of the cost.

Best of luck!
Scott


Hi Tom!

We have a similar experience to John's, but with slightly better results. We were importing ~200K records into a database, and that took a few hours too, on a non-dedicated-capacity server.

I think that record size should matter as well - our records had 5 fields each, with some text in them.
So theoretically, 2-field records should be faster than that.

Maybe John's data was significantly bigger than just a couple of strings per row, and that's why it took longer.

An external DB is also an option - there is a built-in plugin, the SQL Connector, that lets you work directly with an external DB. However, from what I can see in the plugin itself, the selection is currently limited to Postgres, MySQL and MSSQL only.

As for query performance - I don't think there are a lot of examples of live apps with 20M+ records, so I can't comment for sure on how this will work. But we can tell from real-world experience that a lookup over 200K records performs exactly like a lookup over 100-200 records, so we believe the indexes are working well, and a simple query over a bigger dataset should be as fast as one over a smaller one.
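To make the external-DB route a bit more concrete: this isn't the SQL Connector itself, just a minimal generic sketch (Python with psycopg2, against a hypothetical Postgres table called "domains") of the kind of indexed exact-match lookup Tom described in Q2:

```python
# Minimal sketch: exact-match lookup against an external Postgres table.
# Assumes a hypothetical table domains(domain text, value text); the DSN
# below is a placeholder for your own connection details.
import psycopg2

dsn = "host=db.example.com dbname=mydb user=bubble password=secret"  # hypothetical

with psycopg2.connect(dsn) as conn:
    with conn.cursor() as cur:
        # One-time setup: an index on the lookup column keeps exact-match
        # queries fast even with millions of rows.
        cur.execute("CREATE INDEX IF NOT EXISTS domains_domain_idx ON domains (domain)")

        # The exact-match query from Q2, parameterized rather than inlined.
        cur.execute("SELECT * FROM domains WHERE domain = %s", ("apple.com",))
        print(cur.fetchone())
```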

Please let us know if this helps.
Cheers!

 

Vlad Larin
GM@Bubblewits - #1 No-code Developer & Bubble Certified Partner
  Bubblewits.com - Get in Touch!
  Zeroqode.com - Buy Great Bubble Templates
  Builtwithoutcode.com - Bubble Apps collection

Hi guys,

I'm not very technical (the reason I'm using Bubble :slight_smile: ) and I need to upload a CSV file with 1.3 million rows and 3 columns (just one word in each). Just uploading it from the backend is taking ages (I let it run for 18 hours yesterday and then it crashed after 150K rows had been imported).

How do I go about this? Not sure how to do it via the API or table routes - can someone expand a little?

Thanks!
Bram

I wish I could answer that question right now! With 1M records (and only 3 columns) it's supposed to take less than 10 minutes on a regular server. But since Bubble changed the way actions are processed last December, it can take days now. The solution would be splitting the CSV file into multiple files (say 100), and running the process via the new API Workflow loop so that the 100 files are processed simultaneously and the job gets done in 10 hours… I'm just thinking out loud here, I haven't tested it yet.
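If you try the split approach, here is a minimal sketch (plain Python, run locally before importing; the file name and the 13,000-row chunk size are placeholders, and it assumes the file has a header row) of breaking a large CSV into smaller pieces:

```python
# Minimal sketch: split a large CSV into smaller chunks before importing.
# "data.csv" and the chunk size are placeholders; 13,000 rows per chunk
# gives roughly 100 files for a 1.3M-row source.
import csv

CHUNK_ROWS = 13_000

with open("data.csv", newline="") as src:
    reader = csv.reader(src)
    header = next(reader)  # keep the header for every chunk

    chunk, chunk_no = [], 0
    for row in reader:
        chunk.append(row)
        if len(chunk) == CHUNK_ROWS:
            with open(f"chunk_{chunk_no:03d}.csv", "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(header)
                writer.writerows(chunk)
            chunk, chunk_no = [], chunk_no + 1

    if chunk:  # write any remaining rows
        with open(f"chunk_{chunk_no:03d}.csv", "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(header)
            writer.writerows(chunk)
```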

Search the forum for how to use an API Workflow (with a loop calling itself), and look for free software outside Bubble to split the file into smaller files (or use a small script like the sketch above). Let me know the result, and maybe I can give you some help on that. Does anyone have a better approach?
