1T - CSV Uploader import limit?

Hey Folks,

I’ve been using Eli’s awesome plugin for a while now to bring in some complex CSV files on a web page.

Our client (bless 'em) has gone a little nuts and jumped from regularly importing 3,700 records to trying to bring in over 5,000 last month, and the import seems to time out.

The fields are ID’d correctly, but when we process the records I see the first pass of processing go through, then it stops. Usually I’d see additional passes equivalent to the number of files it needs to create (in 100-record chunks), but instead the updating-file icon just keeps going until I refresh.

Can’t see anything in the logs, and no errors in the Chrome console. My only fix is to break the file into two sets, import each, then run a merge later. The original CSV is 373K; I can successfully bring in 4,000 records, which equates to 276K. Not sure if the limit is row-based, size-based, or something else.
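In case it helps anyone else doing the same workaround, the split can be scripted rather than done by hand. A rough sketch using Python's standard `csv` module (the filenames and function name are just placeholders, not anything from the plugin):

```python
import csv

def split_csv(src, out1, out2):
    """Split src into two CSVs, repeating the header row in each half."""
    with open(src, newline="") as f:
        rows = list(csv.reader(f))
    header, data = rows[0], rows[1:]
    mid = len(data) // 2
    for name, chunk in ((out1, data[:mid]), (out2, data[mid:])):
        with open(name, "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(header)   # each half keeps the header
            writer.writerows(chunk)
    return mid, len(data) - mid       # row counts of the two halves
```

You'd still need the manual merge workflow afterwards, but at least the split is repeatable.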

I’m running on the Professional plan. Any suggestions for where to look?

Cheers, John

Hey @johnnyweb! Did you find a workaround for this one yet? Pinging @eli as well.

I’m stuck in the infinite loop of uploading a file with just over 6k records. :crazy_face:

Hey @mac2, unfortunately not yet. My only workaround is to split it into two loads and rejoin them with a manual workflow. Not really the best solution.


Hey guys, so this isn’t well documented, which is on me, but Bubble doesn’t allow us to create more than 50 files in a single plugin action run. I’ll add this to the documentation and expose an error for it as well.

Ultimately, though, your batch size multiplied by 50 is the maximum number of records the plugin can upload in one run.

100 per batch = 5000 row limit
200 per batch = 10000 row limit

Hopefully that helps.


Awesome - thanks @eli , that gives me something to chase :+1:

I suspected it was a limit protecting the platform, but didn’t know where to start chasing.


Confirmed changing the row limit fixed the issue - thanks @eli

cc @mac2


@johnnyweb @eli