How to import 550,000 rows of data into Bubble without a hiccup?

Before I attempt this…

Can Bubble handle an import of this size?

If not, is there a better way to go about getting the data into Bubble’s back end?

We did 3,200,000 rows, but we did it with a script against the API: we fed a CSV to a script that talked to the API endpoint published for that object.
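Roughly, in Python, something like this would do it (not the actual script; the app URL, data type name, and API token below are placeholders you'd swap for your own):

```python
import csv
import requests

# Placeholders -- substitute your own app URL, data type, and Data API token.
API_URL = "https://yourapp.bubbleapps.io/api/1.1/obj/customer"
API_TOKEN = "YOUR_DATA_API_TOKEN"
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}

def upload_csv(path: str) -> None:
    """Read the CSV row by row and create one thing per row via the Data API."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)  # column headers become field names
        for i, row in enumerate(reader, start=1):
            resp = requests.post(API_URL, json=row, headers=HEADERS, timeout=30)
            if resp.status_code >= 400:
                # Log and keep going so one bad row doesn't abort the whole run
                print(f"row {i} failed: {resp.status_code} {resp.text}")

if __name__ == "__main__":
    upload_csv("data.csv")
```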


I had thought of this as a backup option.

Any service that will fire the calls off one by one?

How often did you make the calls? Every 5 seconds?

We just looped through the CSV as quickly as it would go. How many CPU units you have will affect how quickly the data uploads, but a single script going as fast as it can is probably in no danger of running over the 1,000 API calls a minute limit.
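If you did want to stay safely under that per-minute ceiling, a simple pacing wrapper would look something like this (illustrative only; it reuses the placeholder names from the sketch above):

```python
import time

MAX_CALLS_PER_MINUTE = 1000                  # the per-minute limit mentioned above
MIN_INTERVAL = 60.0 / MAX_CALLS_PER_MINUTE   # roughly 0.06 s between calls

def throttled(rows):
    """Yield rows no faster than MAX_CALLS_PER_MINUTE per minute."""
    last_call = 0.0
    for row in rows:
        wait = MIN_INTERVAL - (time.monotonic() - last_call)
        if wait > 0:
            time.sleep(wait)
        last_call = time.monotonic()
        yield row

# Usage with the upload loop sketched earlier:
#   for row in throttled(csv.DictReader(f)):
#       requests.post(API_URL, json=row, headers=HEADERS, timeout=30)
```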

Where’d you get the script?

Out of box? Or custom?

It has to be written for your specific case, so there's coding involved.

There are bulk operations on the paid plans that can do it from the backend with no code.

Yes we are on a paid plan.

I just didn’t want to start the import, have it crap out at 100k entries, and then have to start over…

@Bubble, is this going to be OK if I use the uploader for all entries?

I’m on a paid plan and used the bulk uploader for about 300 large CSV files. My biggest one was 424k rows with 13 columns. It took a few hours, but it did complete. If it’s possible to break the sheet into 2, that might increase your chances. I don’t know if that’s possible for you.
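If you do want to split it, a quick Python sketch like this would do it (the file names and chunk size are just placeholders, not anything Bubble-specific):

```python
import csv

def split_csv(path: str, rows_per_chunk: int = 150_000) -> None:
    """Split a large CSV into smaller files (chunk_1.csv, chunk_2.csv, ...)
    so each one can be fed to the bulk uploader separately."""
    with open(path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)
        chunk, writer, out = 0, None, None
        for i, row in enumerate(reader):
            if i % rows_per_chunk == 0:
                if out:
                    out.close()
                chunk += 1
                out = open(f"chunk_{chunk}.csv", "w", newline="", encoding="utf-8")
                writer = csv.writer(out)
                writer.writerow(header)   # repeat the header in every chunk
            writer.writerow(row)
        if out:
            out.close()

if __name__ == "__main__":
    split_csv("all_rows.csv")
```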

Hey @JustinC

The thought makes me giggle:

Given that a day has 86,400 seconds, at 5 seconds a call you’d only get through 17,280 rows a day, so 3,200,000 rows would take 6 months plus :smiley:

peace,

Ashley

I have strong knowledge of robotic process automation and have set up robots in the past to upload 150,000 rows.

If you need a robot designed and run to do this, then please message me.


I did this manually for about 650 thousand rows of data, as well as creating connections between data types, and did it all using Bubble’s regular bulk upload methods from the database view. It took me about 6 days, working mostly overnight. There were some errors, mostly duplicate uploads, but these were caught when doing bulk modify operations because the data was validated first. I broke it up into chunks of 150,000 rows.

Long story short, it is doable but it is a bit of a fatiguing process.


This topic was automatically closed after 70 days. New replies are no longer allowed.