Storing a large amount of data retrieved from an API in Bubble is too slow

Hi all,

I would like to create a list of things based on data that I receive from an API. I have already tried both approaches: "Schedule API Workflow on a list" and scheduling the API workflow recursively, one iteration at a time. However, both are painfully slow: it took 4 hours to create 600 entries in the DB.

This is my current backend workflow:



Any suggestions on how to speed this up? I would like to import several thousand data points this way in the future.

Thanks for any hints!!! :slight_smile:

Best,
Philipp

My first question is: Do you need to create an item for each player?
Can you explain what you are doing with the data?
I see a lot of users creating things in the DB when they don't really need to.

Hi Jici, I've already seen a few of your API-related answers! Let's see if I can explain my situation properly:

We want to pull a list of soccer players from Apify every two weeks via an API call. For each player we pull from the API, we want to create an entry of our "player" data type.

I have an API call set up now that returns 1,400 items from Apify. My aim is to store this data in the Bubble DB. Why store it in the first place? We will run complex searches and filtering, we plan to expand the project to thousands of players, and we want to generate reports for players, etc.
To me it makes sense to store the data in the DB for this use case, but I'm more than eager to hear your opinion. :slight_smile:


I have already followed Jacob's video: How to Create Data in Bulk in Bubble.io | Bubble.io Tutorial - YouTube

However, my data fields contain dates, images, and even lists of strings, so I have no clue how to format the plain-text JSON body properly so that I can store the data in my DB.
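Is the body supposed to look something like the sketch below? That is my best guess for a single line of the bulk request (the field names are just placeholders for my "player" type), but I don't know whether the date, image, and list formats are right:

```python
import json

# Rough guess at a single row of the bulk request body. The field names
# ("name", "birth_date", "photo", "positions") are placeholders for fields
# on my "player" type, not the real ones.
row = {
    "name": "Jane Doe",
    "birth_date": "1998-07-21T00:00:00Z",       # date as an ISO 8601 string?
    "photo": "https://example.com/photo.jpg",   # image field as a URL string?
    "positions": ["Midfielder", "Winger"],      # list of texts as a JSON array?
}

# The bulk endpoint apparently wants one JSON object per line (plain text),
# so the full body would be many of these joined with newlines.
print(json.dumps(row))
```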

In your case, I'm not sure there's a better solution. Having more capacity can help, but the best approach is to iterate one by one, like you are doing.
Sometimes, using a third-party tool like Make/Integromat that iterates the JSON array for you and connects to the Bubble Data API can be faster.
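Roughly, what those tools do is loop over the array and send one POST per item to your Data API endpoint. Here is a minimal sketch of the same idea in Python; the app URL, data type, field names, and Apify keys are all placeholders you would need to adapt:

```python
import requests

# Placeholders: replace with your app URL, environment, data type, and Data API key.
DATA_API_URL = "https://yourapp.bubbleapps.io/version-test/api/1.1/obj/player"
HEADERS = {"Authorization": "Bearer YOUR_DATA_API_KEY"}

def push_players(players):
    """Create one Bubble thing per Apify item, one request at a time."""
    for p in players:
        resp = requests.post(DATA_API_URL, headers=HEADERS, json={
            "name": p["name"],                  # assumed Apify keys / Bubble field names
            "birth_date": p["birthDate"],
            "photo": p["photoUrl"],
            "positions": p.get("positions", []),
        })
        resp.raise_for_status()
```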
I've never tested it to compare, but it's possible that a CSV import will be faster. So maybe you can use a tool to convert the JSON to CSV and use the import function instead.
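Untested, but the conversion itself is simple. Something like the sketch below (the Apify keys and column names are guesses) would flatten the JSON into a CSV you can map in Bubble's import tool; note that list fields have to be flattened into a single text column:

```python
import csv
import json

# Sketch: flatten the Apify export (players.json) into a CSV for Bubble's importer.
# Column names and Apify keys are assumptions; adjust them to your "player" type.
with open("players.json", encoding="utf-8") as f:
    players = json.load(f)

with open("players.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "birth_date", "photo", "positions"])
    writer.writeheader()
    for p in players:
        writer.writerow({
            "name": p["name"],
            "birth_date": p["birthDate"],                   # keep dates as ISO strings
            "photo": p["photoUrl"],                         # image as a URL
            "positions": ",".join(p.get("positions", [])),  # list flattened to comma-separated text
        })
```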

Thanks for these ideas. I will have a look at these alternatives and see what works best for my use case! :slight_smile:
