Creating large CRMs with Bubble

Hi all,

I’m curious about the Bubble community’s thoughts on whether a CRM built in Bubble (using the Bubble database) can handle a large workload, e.g. CRMs with 1 million+ records and thousands of CRM users.

Would a better solution be to create an external DB and access the records via API? Or do we think Bubble can handle such a load without serious speed issues?

Has anyone achieved this, or does anyone have recommendations on how to architect something like this?

You can use the Data Layer plugin to connect to Firebase. I wouldn’t recommend keeping a million records in Bubble unless you know how to avoid “Do a search for” and instead pass the data between pages and use the current page’s data.
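I can’t speak to exactly how the Data Layer plugin works internally, but for reference, an external store like Firebase’s Realtime Database can be read over plain REST, which is the kind of call Bubble’s API Connector (or a plugin) ends up making. A minimal sketch, with a placeholder project URL and token:

```typescript
// Minimal sketch: reading one record from Firebase Realtime Database over REST.
// The project URL, path, and token are placeholders, not from this thread.
const FIREBASE_URL = "https://your-project-id-default-rtdb.firebaseio.com";

async function getContact(contactId: string, idToken: string) {
  // GET /<path>.json returns the node as JSON; ?auth= passes a Firebase ID token.
  const res = await fetch(
    `${FIREBASE_URL}/contacts/${contactId}.json?auth=${idToken}`
  );
  if (!res.ok) {
    throw new Error(`Firebase request failed: ${res.status}`);
  }
  return res.json(); // e.g. { name: "...", email: "...", owner: "..." }
}
```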

What’s the alternative to “Do a search for”?

Following the basic lessons should explain it well.

That’s my thinking too, that Bubble itself cannot handle that many records. But if we keep the 1 million records in an external database, do you think Bubble can handle that volume via an API?

Why is this?

There is some rate limiting on API calls (as I recall, 1,000 calls per minute, more when you add capacity units), but it is technically easy to do a DB design with AWS DynamoDB and API Gateway that can create, read, update and delete items.
But with thousands of users you are probably on a dedicated plan anyway, and then a million items will be nothing (the backend DB is PostgreSQL).
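To give an idea of what that DynamoDB/API Gateway design looks like, here is a minimal sketch of the Lambda handler that would sit behind API Gateway; the table name and key (a Contacts table with a contactId partition key) are illustrative assumptions, not anything from this thread. Bubble’s API Connector would simply call the resulting endpoint.

```typescript
// Minimal sketch: a Lambda behind API Gateway doing CRUD against DynamoDB.
// Table name and key are illustrative assumptions.
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import {
  DynamoDBDocumentClient,
  GetCommand,
  PutCommand,
  DeleteCommand,
} from "@aws-sdk/lib-dynamodb";

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));
const TABLE = "Contacts"; // partition key: contactId (string)

export const handler = async (event: {
  httpMethod: string;
  pathParameters?: { contactId?: string };
  body?: string;
}) => {
  const id = event.pathParameters?.contactId;

  switch (event.httpMethod) {
    case "GET": {
      // Read a single record by its key.
      const res = await ddb.send(
        new GetCommand({ TableName: TABLE, Key: { contactId: id } })
      );
      return { statusCode: 200, body: JSON.stringify(res.Item ?? null) };
    }
    case "PUT": {
      // Create or update: the whole item comes in the request body.
      const item = { contactId: id, ...JSON.parse(event.body ?? "{}") };
      await ddb.send(new PutCommand({ TableName: TABLE, Item: item }));
      return { statusCode: 200, body: JSON.stringify(item) };
    }
    case "DELETE": {
      await ddb.send(
        new DeleteCommand({ TableName: TABLE, Key: { contactId: id } })
      );
      return { statusCode: 204, body: "" };
    }
    default:
      return { statusCode: 405, body: "Method not allowed" };
  }
};
```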


Thank you Simon, so I take it from your comment that you believe Bubble could handle this kind of volume itself with its own DB? (Noting we’d need to scale right up as our user base grows.)

I would say “certainly”, but as I don’t have a million items in my app, it would be nice to get confirmation from others. From a logical standpoint, though, yes. All non-dedicated users are on the same servers/DB today anyway, so adding a million or so rows shouldn’t affect it much (a single PostgreSQL database can handle tens of terabytes of data, and a million rows with a handful of standard text columns is about 200 MB, i.e. roughly 200 bytes per row × 1,000,000 rows).

My app uses an external MySQL database with around 10 million items. I have built an API for Bubble to connect with this, but I use this method mainly because I have large datasets that get updated once a month and/or once a quarter. Importing 2 million items into Bubble is impractical, so I keep everything in a system of its own and connect to that with the API Connector.
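For anyone curious, here is a minimal sketch of the kind of read endpoint the API Connector can call against an external MySQL database; the table and column names are made up for illustration, not my actual schema.

```typescript
// Minimal sketch: a read-only endpoint the Bubble API Connector can call,
// backed by an external MySQL database. Table/column names are illustrative.
import express from "express";
import mysql from "mysql2/promise";

const pool = mysql.createPool({
  host: "db.example.com",
  user: "bubble_api",
  password: process.env.DB_PASSWORD,
  database: "crm",
  connectionLimit: 10,
});

const app = express();

// Paginated list endpoint, e.g. GET /contacts?limit=100&offset=0
app.get("/contacts", async (req, res) => {
  const limit = Math.min(Number(req.query.limit) || 100, 1000);
  const offset = Number(req.query.offset) || 0;
  const [rows] = await pool.query(
    "SELECT id, name, email FROM contacts ORDER BY id LIMIT ? OFFSET ?",
    [limit, offset]
  );
  res.json(rows); // Bubble parses this JSON into a list of things
});

app.listen(3000);
```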

Edit: And as you point out, when you scale you might go to a dedicated server with your own database, and then these numbers won’t matter at all.


So long as you scale up your plan (i.e. capacity) there is no reason why you couldn’t do it all within Bubble itself.

Simon, this is interesting. I am currently exploring how to spin off a project from our main (non-Bubble) app, which will essentially just provide users access to a significant dataset of over 340 million records. I’ve been looking at MongoDB etc. to achieve this, but I’m having whitelabel IP issues (noting you responded to my other post). Do you think we could connect via email to discuss this, as it’s clear you have achieved something similar?

Matt

Sure. Sent you a PM.

@matthew2, were you able to move your (non-Bubble) app to the Bubble platform successfully? Which approach did you choose: an external database with an API connection, or the Bubble database? We’re in the same spot you were.