Importing a large number of records - 3.2M records in a table

We have continued to struggle with getting data from another source into Bubble, but after a week we have now successfully brought in 3.2M records in one table and 300k records across 4 other tables.

After many attempts, we realized that the API connection to a shared instance is not perfect, whether that is due to load on the Bubble server from other apps or to latency somewhere along the connection path. (I am on a 600 Mbps cable modem with 20 Mbps upload, though the speed seems to vary pretty dramatically.)

We ended up writing a PHP/curl script that talks to a custom API endpoint created in Bubble, which receives the records and returns an error code if a record is not created.
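For anyone trying something similar, here is a minimal sketch of that loop, assuming a Bubble workflow endpoint named import_record; the endpoint URL, field names, and BUBBLE_API_KEY environment variable are hypothetical stand-ins rather than our exact setup:

```php
<?php
// Minimal import loop: POST each record as JSON to a Bubble workflow
// endpoint and retry a few times on failure, since the connection
// proved flaky under load. All names below are hypothetical.

$endpoint = 'https://yourapp.bubbleapps.io/api/1.1/wf/import_record';
$apiKey   = getenv('BUBBLE_API_KEY') ?: '';

// In practice the rows came from an export file; two sample rows:
$records = [
    ['name' => 'Example A', 'linked_id' => '123'],
    ['name' => 'Example B', 'linked_id' => '456'],
];

function send_record(string $endpoint, string $apiKey, array $record): bool
{
    $ch = curl_init($endpoint);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode($record),
        CURLOPT_HTTPHEADER     => [
            'Content-Type: application/json',
            'Authorization: Bearer ' . $apiKey,
        ],
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 30,
    ]);
    $body   = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    // Anything other than HTTP 200 is treated as "record not created".
    return $body !== false && $status === 200;
}

foreach ($records as $i => $record) {
    $ok = false;
    for ($attempt = 1; $attempt <= 3 && !$ok; $attempt++) {
        $ok = send_record($endpoint, $apiKey, $record);
        if (!$ok) {
            sleep($attempt); // simple backoff before retrying
        }
    }
    if (!$ok) {
        fwrite(STDERR, "row $i failed after 3 attempts\n");
    }
}
```

A short sleep or usleep between posts is also an easy throttle if you need to stay under a per-minute rate.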

We launched 8 instances of the script and dialed up the CPU blocks to get to the magical 1K API calls per minute. It ended up taking 11 CPU blocks to do this! That seems like a lot of money just for importing data, but nonetheless, we have most of the data in after 5 days of the script running. :slight_smile:
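As an aside, the same fan-out can be done inside a single PHP process with curl_multi instead of launching separate copies of the script; this is just a sketch with the same hypothetical endpoint, not what we actually ran:

```php
<?php
// Sketch: send a small batch of records in parallel with curl_multi.
// Endpoint URL and payloads are hypothetical.

$endpoint = 'https://yourapp.bubbleapps.io/api/1.1/wf/import_record';
$apiKey   = getenv('BUBBLE_API_KEY') ?: '';

$batch = [
    ['name' => 'Example A'],
    ['name' => 'Example B'],
    ['name' => 'Example C'],
];

$mh      = curl_multi_init();
$handles = [];

foreach ($batch as $i => $record) {
    $ch = curl_init($endpoint);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode($record),
        CURLOPT_HTTPHEADER     => [
            'Content-Type: application/json',
            'Authorization: Bearer ' . $apiKey,
        ],
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 30,
    ]);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}

// Drive all transfers to completion.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);
    }
} while ($running && $status === CURLM_OK);

// Non-200 responses mean the record was not created.
foreach ($handles as $i => $ch) {
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    if ($code !== 200) {
        fwrite(STDERR, "record $i failed with HTTP $code\n");
    }
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```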

Since our 3.2M-record table connects to two different data types, I understand this is probably the only way to get the data in; we could not use the backend upload or modify features with any success.

Now we're finally moving back from data processing to MVP development.
