Synchronize large user-uploaded CSV file with database

We have a scenario where our users/customers need to maintain their own data in our Bubble application; the data is supplied by a third party as a CSV file with up to ~100,000 rows/records. Context: material prices and descriptions, which they then use to create quotes.

This CSV file does not contain any reference to the Bubble database, such as the "unique id", nor any indication of whether a row has already been added.

Ideally we want the user to upload the file in our application, let a background process work its magic, and then notify the user that the data has been created/updated.

We see that there are opportunities through Bubble to either:
a) bulk create data through the user interface (“Upload”)
b) bulk update data through the user interface (“Modify”)
c) bulk create data through backend workflow action (“Upload data as CSV”)
d) bulk create data through the API (Data API requests - Bubble Docs)
e) use third party plugins/integration platforms
f) create our own external logic
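For option d), my understanding is that the Data API's bulk endpoint takes newline-delimited JSON (one object per line, `Content-Type: text/plain`) and caps the number of objects per request. A minimal sketch of building those batches in an external script; the field names (`code`, `price`) and the `material` type are hypothetical examples, not anything from our actual app:

```python
import json

def to_bulk_payload(rows):
    """Serialize rows as newline-delimited JSON for POST .../api/1.1/obj/material/bulk."""
    return "\n".join(json.dumps(row) for row in rows)

def chunk(rows, size=1000):
    """Split rows into batches; `size` is an assumed per-request cap - check the docs."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

rows = [{"code": "M-001", "price": 12.5}, {"code": "M-002", "price": 9.9}]
payload = to_bulk_payload(rows)
# Each batch would then be sent as:
#   POST https://<app>.bubbleapps.io/api/1.1/obj/material/bulk
#   Authorization: Bearer <api token>
#   Content-Type: text/plain
```

With ~100,000 rows that is still on the order of a hundred requests, which seems workable from a background job.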

Is there any easy way to accomplish this? Currently we're looking into enriching the CSV file via a custom API that checks whether each entity already exists and, if so, adds its unique id to the file. Even then, it looks like we would still need to split the data into rows to be created and rows to be updated. On top of that, there appears to be no bulk update available through the API or backend workflows. An alternative is to build an external application that uses the API to bulk create or update.
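The create/update split we have in mind could look roughly like this. This is only a sketch under assumptions: the CSV has a natural key column (here called `code`), and `existing` stands for records already fetched from the Data API, mapped as natural key → Bubble unique id (the id below is fabricated for illustration):

```python
import csv
import io

def partition(rows, existing):
    """Split CSV rows into rows to create and (unique_id, row) pairs to update."""
    to_create, to_update = [], []
    for row in rows:
        uid = existing.get(row["code"])  # match on the assumed natural key
        if uid is None:
            to_create.append(row)        # no match: candidate for bulk create
        else:
            to_update.append((uid, row)) # match: needs a per-record PATCH
    return to_create, to_update

sample = "code,price\nM-001,12.5\nM-003,7.0\n"
existing = {"M-001": "1680000000000x1"}  # fabricated unique id
creates, updates = partition(csv.DictReader(io.StringIO(sample)), existing)
```

The `creates` list could go through the bulk endpoint, while `updates` would have to be sent one PATCH at a time, since there is no bulk update, which is exactly the bottleneck we're worried about at ~100,000 rows.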

I see on the forum that there is talk about external plugins/integration platforms such as Integromat, but I'm not sure whether they handle scenarios like this.

Any advice? Thanks!
