I’ve created a backend workflow to let our users upload CSV files via our web app. It’s been working flawlessly for CSVs of up to 4,000 rows. It uses the 1T - CSV Uploader plugin, which splits the file into smaller batches (100 rows each) that are then uploaded through an API Workflow.
However, we are now trying to upload CSV files that contain up to 100,000 rows. We’ve tried changing the batch size, but the plugin either shows “uploading file…” indefinitely, or creates the batches and then does nothing.
Has anyone had success uploading CSV files this big?
Thanks for replying! Our users have access to monthly CSV reports generated by music distribution platforms that are super complicated and very often exceed 50,000 rows. We’re processing these reports to create more digestible analytics for them.
Not true! This is just lazy programming. My plugin Better Uploader allows you to upload more than 50 files in one action. What he’s referring to is that Bubble does not allow plugins to send more than 50 files at a time. This is sensible, because Bubble wouldn’t want someone attempting to send hundreds or potentially thousands of files in one go.
What the plugin developer must do is send the files in batches, wait until each batch is done uploading, then rinse and repeat until everything has been uploaded (this is done programmatically in code). In other words, the plugin developer must use the callback parameter on the upload function.
So the answer to this question is: YES, it can absolutely be done, but it will probably require some code!
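To illustrate the batch-and-callback pattern I mean, here’s a minimal sketch. Note that `uploadBatch` is a hypothetical stand-in for whatever upload call your plugin actually uses (it’s simulated here so the example runs on its own); the point is the control flow, where each batch’s callback triggers the next batch:

```javascript
// Hypothetical stand-in for the real upload call (e.g. Bubble's file upload).
// Simulated synchronously here so the sketch is self-contained.
function uploadBatch(files, done) {
  done(null, files.length); // pretend every file in the batch succeeded
}

// Upload `files` in chunks of `batchSize`, waiting for each batch's
// callback to fire before starting the next one.
function uploadInBatches(files, batchSize, onComplete) {
  let index = 0;    // position of the next batch in `files`
  let uploaded = 0; // running count of successfully uploaded files

  function next() {
    if (index >= files.length) return onComplete(null, uploaded);
    const batch = files.slice(index, index + batchSize);
    index += batch.length;
    uploadBatch(batch, (err, count) => {
      if (err) return onComplete(err);
      uploaded += count;
      next(); // rinse and repeat until all files are sent
    });
  }
  next();
}
```

The key design point is that no batch starts until the previous batch’s callback confirms completion, which is what keeps you under the per-request limit no matter how many files (or rows) there are in total.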