Very slow loading of external API call data: how to speed up?

I am running a simple request to a weather API: I send a latitude/longitude location and ask for the historical average temperature for days 1 - 365 (Jan 1, Jan 2, Jan 3…Dec 31). Each row in my database contains (Location, Month, Day, AvgTemp), and the workflow cycles through all 365 days of the year…but it takes 13 minutes to finish.

How do I bring this from 13 minutes down to just a few seconds or less? I’ve heard of potential solutions like using JSON files, running on an external server, or building a server-side application. Any thoughts, suggestions or direction would be greatly appreciated.


Which weather API? Is there any way of conflating the 365 requests into one? For example, does the API offer a way of getting a year’s worth of data in one request? (i.e. return a list of results rather than a single result for a single day or location)

It is the “Climate Normals API” (here). Notice that the data I am calling is just one already-summarized data point (i.e. the “temp” field for Aug 1 is an already-calculated average of the past 30 Aug 1sts)…it’s very simple; I don’t have to calculate or run over 30 years of data, but rather just grab one data point for each day.

But because I have to send parameters of start_date = 01-01 and end_date = 12-31, it cycles through all 365 days in the year.

I read in the API:

For convenience, up to 1 year of data can be requested per API call. However, requests that exceed 1 month will count as multiple requests against the plan daily quota.

So you could get a year’s worth of data in a single call and then do the maths in Bubble? … or am I being dumb and missing something :slight_smile:
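
To make that concrete, here’s a rough sketch of building that single yearly request. Note the host and parameter names are placeholders, not the real Climate Normals API (the thread only mentions `start_date`/`end_date` in MM-DD form), so check the actual API docs:

```javascript
// Sketch: one request for the whole year instead of 365 per-day calls.
// NOTE: the host and parameter names are assumptions -- adapt them to
// the real Climate Normals API documentation.
function buildYearRequest(lat, lon) {
  const params = new URLSearchParams({
    latitude: lat,
    longitude: lon,
    start_date: "01-01", // MM-DD, as described above
    end_date: "12-31",
  });
  return `https://weather.example.com/v1/normals?${params}`;
}

console.log(buildYearRequest(41.88, -87.63));
```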

It seems you have a choice: many API calls, which take a long time but come conveniently pre-calculated, vs fewer API calls where you have to do some maths. Sorry, I don’t see any shortcut, but perhaps someone else will. :thinking:

Thanks for the help. One call does produce all 365 results for a year, but I suspect it is the loading of 365 rows into the Bubble database that is taking so long.

When I initiate the call in Bubble’s API Connector plugin, I get all 365 days instantly. But making that same API call in a backend workflow to load the data into the database (365 rows: Long/Lat, Month, Day, temp) takes a full 13 minutes!

Here is what the first 2 sets of the full 365 look like…

Ahh ok - what does your Bubble table look like? I am guessing that is 365 inserts? Yep, that will take ages in Bubble :frowning: . You can’t sensibly convert that to 365 columns in a single table to make it a single insert, but maybe there is a halfway house to reduce the quantity of inserts.

As an idea - in one Bubble app I store scores. Users have hundreds of scores, and there are thousands of users, so I quickly get to hundreds of thousands of data points that scale linearly. So I store each user’s scores in a single comma-separated string, just to reduce the quantity of DB Things (and the accompanying inserts, updates, selects and joins).
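
For anyone curious what that looks like in practice, a minimal sketch (the function and variable names here are just illustrative, not from a real app):

```javascript
// Sketch: pack 365 daily values into one text field so you store a
// single DB Thing instead of 365 rows.
function packTemps(temps) {
  return temps.join(","); // e.g. "21.4,20.9,22.1"
}

// Unpack the field back into numbers when you need to read it.
function unpackTemps(csv) {
  return csv.split(",").map(Number);
}

const packed = packTemps([21.4, 20.9, 22.1]);
console.log(unpackTemps(packed)[1]); // 20.9
```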

Thanks for the tips on using a comma-separated string. Please let me know if you know of a helpful link that shows how to get started on what I suspect are two fronts: (1) how to initiate the “loading” of the data as a comma-separated string instead of a database load of 365 rows; and (2) how I would query for a certain number in the string (e.g. the AvgTemp on Aug 1 for location Chicago).

Below is a screenshot of my data table on Bubble:

It’s going to be messy. The benefit of fewer DB Things has a cost: the data is no longer searchable by the DB. Let’s say it was JSON format rather than CSV - you could use JS to get the value at a certain position in the array of 365, the position corresponding to the day you’re looking for. Depending on your use-case, maybe it’s easier just to go and get it from the API?
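
As a sketch of that position lookup, assuming the 365 entries are ordered with Jan 1 first (non-leap-year layout):

```javascript
// Sketch: convert a month/day into a 0-based day-of-year index, so you
// can read the matching entry straight out of the 365-value array.
// Assumes a non-leap year (365 entries, Jan 1 at index 0).
function dayOfYearIndex(month, day) {
  // Cumulative days before each month (Jan..Dec), non-leap year.
  const daysBefore = [0, 31, 59, 90, 120, 151, 181, 212, 243, 273, 304, 334];
  return daysBefore[month - 1] + (day - 1);
}

console.log(dayOfYearIndex(8, 1)); // 212, i.e. Aug 1
```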

Because you are dealing with 365 things (a lot), the way I would do it is create a plugin so that I could manipulate the JSON returned from the API into the shape I wanted (lists) and stick the lists in the DB.
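
A plugin action along those lines might look roughly like this; the response shape (a month/day/temp per entry) is an assumption based on the table described earlier in the thread:

```javascript
// Sketch: flatten the API's per-day records into parallel lists that
// can be written to the DB in one go instead of 365 inserts.
// The input shape here is a guess -- adjust to the real API response.
function toLists(days) {
  return {
    months: days.map((d) => d.month),
    days: days.map((d) => d.day),
    temps: days.map((d) => d.temp),
  };
}

const sample = [
  { month: 1, day: 1, temp: -3.2 },
  { month: 1, day: 2, temp: -2.9 },
];
console.log(toLists(sample).temps); // e.g. temps becomes [-3.2, -2.9]
```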

You’d have to think carefully about how you query and use the data, to ensure you aren’t just moving the “13 minutes” problem into something that’s a bigger problem in itself. :slight_smile: How often do you need to run the 13-minute job? Once / hourly / daily / monthly…

Another idea is using an external database - not Bubble’s database - look for posts by @jared.gibb on how people do this.


I just finished (Monday) a plugin for someone doing this exact thing! Lemme know if you need a hand!

Thanks @lindsay_knowcode

Awesome, thank you @lindsay_knowcode for the ideas and responses. So helpful.

Regarding frequency of the API call, it varies based on when a user signs up or adds a location…so it could be zero times a day, several times, but most likely less than 100x a day. I do have a different weather API call from another provider with a very similar “13 minutes” problem, and that runs daily for all locations (in the hundreds).

A solution involving a custom plugin or external database sounds promising.

Appreciate the intro to @jared.gibb. I will reach out to him separately to further explain what I am looking to achieve. Thank you!



Just following up - @jared.gibb was able to fix the 13 minute problem by leveraging Firebase, bringing the time down to around 30 seconds. I appreciated his quick work and highly recommend him to anyone else trying to solve a similar problem.

This was a cool solution. We were able to keep all of the data in Bubble and only needed to leverage the flexibility of running code in their cloud vs Bubble’s cloud.

It’s interesting that a ‘bulk call’ to the database times out at 4 minutes but a server-side action can only run for 30.

Thanks for the kind words @elatin, and of course you too @lindsay_knowcode for making the connection here!


Hi Jared, I’m struggling with the same problem. I’m loading about 450 records with a single API call and it’s just slow. Can you help me do the same thing? I have no idea how you integrate Firebase to solve this, but it sounds like it works great!