"Operation timed out -- app too busy" when scheduling API workflow

I am retrieving data from the API and then making changes to the database according to these rules:


However, given the large amount of data, I get the “Operation timed out – app too busy” message every time I click the button, and no data is saved in the database. The alternative would be to save the data in another data type where there is only one cell that contains the list of all API-data-type data (see picture below), and that works smoothly, but the problem is that I cannot manipulate the API data type this way.
image

So, I basically have no solutions remaining. What do you suggest I do?

Hi @sensei01,
instead of scheduling an API workflow on a list, schedule a normal API workflow and go through the list recursively.
Check out this resource for more info
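The recursive pattern Bubble recommends (one backend workflow that processes a single item, then reschedules itself with the remainder until the list is empty) can be sketched in plain Python. This is only an illustration of the control flow; `save_item` stands in for whatever database step the workflow performs and is a hypothetical name, not a Bubble API.

```python
def save_item(item):
    """Stand-in for the workflow step that saves one item to the database."""
    return item.upper()

def process_list_recursively(items, results=None):
    """Mimics a Bubble recursive workflow: handle the first item,
    then 'reschedule' with the rest of the list until it is empty."""
    if results is None:
        results = []
    if not items:                       # termination condition: list is empty
        return results
    first, rest = items[0], items[1:]
    results.append(save_item(first))    # process exactly one item per run
    return process_list_recursively(rest, results)  # "reschedule" with the rest
```

In Bubble the equivalent is a backend workflow whose last action is "Schedule API Workflow" on itself with `list:minus item:first item`, guarded by an "Only when list count > 0" condition.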

Hey, while we are here: what are people doing to detect when a recursive workflow has timed out?

Is there a way to know when this happens so you can set up an auto-resume function/workflow?


Good morning

Hate to reopen an old post, but I am facing the same issue. I need to save data from an API call. However, when using a recursive workflow, there is no data type for the API call. How would I recursively pass all the API call values?

I’d recommend passing it all in a bulk data push. You’re still limited to a ~20-second timeout, but you should be able to create at least 500-1000 records in that timeframe.
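For context, the "bulk data push" discussed here is Bubble's Data API bulk endpoint, which accepts a `text/plain` body of newline-delimited JSON objects (one object per line). A minimal sketch of building that body; the field names and the app URL in the comment are placeholders, not from this thread:

```python
import json

def build_bulk_body(records):
    """Build the newline-delimited JSON body the bulk endpoint expects:
    one compact JSON object per line, no blank lines, no trailing newline."""
    return "\n".join(json.dumps(r, separators=(",", ":")) for r in records)

records = [
    {"name": "Alice", "score": 10},
    {"name": "Bob", "score": 12},
]
body = build_bulk_body(records)
# POST this body to https://YOURAPP.bubbleapps.io/api/1.1/obj/TYPENAME/bulk
# with headers:
#   Authorization: Bearer <your API key>
#   Content-Type: text/plain
```

Bubble responds with one status line per submitted object, so the number of lines in the body should match the number of lines in the response.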

Hi! @jared.gibb

would you mind explaining what “bulk data push” is? I’d greatly appreciate it

So, I think I figured it out. I can still use the “Schedule API Workflow on a List” method; however, I need to call it from another backend workflow. I created a new backend workflow that calls the original backend workflow.

This way the workflow on the page doesn’t time out (it actually completes its own actions in a few seconds). Then all the magic happens behind the scenes.

Page workflow → Schedule backend workflow → Schedule workflow in a list

I do have another question. During testing I encountered some errors with my API call. For example, the API is rate limited to one call per second. In the server log, I can see that the result returned an error code. How would I retrieve this code (result)? I would like to add a field to my database like “Last Result” and record the error/success number, so I can show my user if something is wrong.

How long should an API call on a list take? I know the API call has about 364 items, but it stopped after 274. That was 15 minutes ago. It still has 90 to go, but nothing is happening. It never went on to the next step in the workflow, and according to the log, the last thing that happened was that one new thing was created. There is no error or anything; it’s as if it just stopped.

Any ideas?

@andreas2 “Schedule workflow on a list” is crap and gives out when it feels like it. Plus, they don’t suggest any lists over 100 (I don’t trust it with any list over 20). Switch to either a recursive workflow (Recursive Scheduled Workflows - Bubble Docs), or, if you’re just creating a lot of things, use the Bulk Data API to create 1000 things in ~30 seconds (Data API - Bubble Docs).


Thanks for sharing this here! Didn’t have time to track references down today.


Thanks for the reference. I am looking into it now, but I am not sending data to Bubble, so I'm not sure how to make this work. I am using a GET request to retrieve data, then saving it to Bubble. Any reason you don’t trust the workflow on a list?

Well, in the “Comparing scheduling on a list vs. scheduling recursively” section of the link I sent, even Bubble says < 50-100 items (why is there a range for a max #??) for “Schedule on a list”; otherwise it will not be reliable.

So pretty much right there it’s a good idea to not use it if you are doing 364 items :laughing:

My problem with “Schedule on a list” is that you set a static x seconds between each workflow. So if you set it to 2 seconds and your workflow takes longer than 2 seconds to finish, your requests just start to pile up on your app, resulting in what you’re seeing: it doesn’t finish everything because your capacity maxes out.
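The pile-up described above is easy to quantify: when each workflow actually takes longer than the fixed scheduling interval, the queue grows by the difference for every item. A quick sketch of that arithmetic (the runtimes here are illustrative, not measured from the app in this thread):

```python
def backlog_after(n_items, interval_s, actual_runtime_s):
    """Seconds of queued-up work after n items when workflows are
    scheduled every `interval_s` seconds but each takes `actual_runtime_s`."""
    per_item = max(0, actual_runtime_s - interval_s)
    return per_item * n_items

# 364 items scheduled 2 s apart, each actually taking 3 s to finish:
# the queue grows by 1 s per item, i.e. ~364 s of backlog by the end,
# which is when capacity maxes out and later items silently never run.
```

A recursive workflow avoids this entirely because the next run is only scheduled after the current one finishes.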

All valid reasons :slight_smile:
So, I’ve created a bulk API call but am running into some issues.
First off, no things are being created. The workflow seems to finish and indicates that it ran, so I was hoping to learn how to troubleshoot this.
My second question is in regard to passing objects. For the sake of testing, I am only using text fields, but ultimately I need to store the user object (i.e., who initiated the API call). Is this possible?

I’ve pasted the end of the log. It shows the POST action, but then nothing happens. Any idea what to look for here?

Thanks!

It is good practice to add :formatted as JSON-safe after each dynamic input (remove the quotes on each side of the dynamic parts; it adds them in already).


Also, don’t put any spaces between anything. Then use a line break as the delimiter (I think you have that already).
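The reason :formatted as JSON-safe matters is that it adds the surrounding quotes and escapes any quotes or line breaks inside the value, whereas hand-typed quotes around a dynamic part produce invalid JSON whenever the value itself contains a quote. The same idea, sketched with Python's `json.dumps` as the stand-in serializer:

```python
import json

name = 'Jane "JJ" Doe'   # a value that happens to contain quotes

# Hand-wrapping the value in quotes produces invalid JSON:
broken = '{"name": "' + name + '"}'

# Letting the serializer add quotes and escape the value works:
safe = '{"name": ' + json.dumps(name) + '}'

json.loads(safe)   # parses fine; json.loads(broken) would raise an error
```

This is also the usual cause of the “Could not parse as JSON” error when initializing the call.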

The “Created By” field unfortunately can’t be modified… the best I can think of is creating a User field and setting that field to the Current User’s unique ID… There are ways to run these API calls in the context of the Current User, but I think it gets more complicated using OAuth; I'm not too sure about that one.

So you are able to use the Data API? It sounded like you needed to do an API call for each item or something?

Thanks! When I initialize the API call I get errors.

I removed all the spaces, but it’s giving me a “Could not parse as JSON” error. This is how I have it configured. Did I do something wrong here?

Ah, I found at least one issue here. One of my quotes (") was not the right kind. I changed it, and now at least I can initialize successfully.

That was it. One of the quotes. Ugh…
Is it possible to pass a user object? I don’t want to actually change the Created By field, but I do have a reference in the data type to the user.


Yes, if your thing has a User field, you can just pass it the Current User’s unique ID and it will match the actual User.
