harry7
Hey guys,
I'm really struggling to wrap my head around how I can ensure an API workflow on a list completes fully before another API workflow on a list starts. Essentially I have a backend action that runs every 24 hours to update 'subscriptions'. If any subscriptions fit the criteria, the backend action must create a 'usage' for each subscription. However, to create the 'usage' I need to ensure the subscription's company has a current access token for an external API. To do this I made 3 workflows:
- the API workflow that runs every 24 hours
- the API workflow to refresh the access token
- the API workflow to create a ‘usage’
Within workflow 1, I trigger both workflows 2 and 3 as API workflows on a list. I need to ensure workflow 2 finishes running on all list items before workflow 3 runs. This is because workflow 3 needs a current access token to execute some API calls.
Workflow 2: (screenshot not captured)
Workflow 3: (screenshot not captured)
Thanks in advance!
Hi @harry7
Best is to schedule workflow 3 at the end of workflow 2. It is then scheduled on each item rather than on a list, which makes it dependent on the previous result.
harry7
Thanks, I'll try this out.
I would suggest trying a database trigger for workflow 3. You can't know in advance the number of items in workflow 2's list, and you can't estimate how long workflow 2 will take to make changes to all items in the list, so a database trigger looks like a better option here than a recursive workflow.
harry7
To be honest, I hadn't thought of this, and it could be a good solution; however, there are a few things that I think wouldn't work:
- How to get the list of 'subscriptions' to workflow 3. By the time this workflow is triggered, the subscriptions have been updated and can no longer be searched for (the distinguishing fields have been updated, so there is no way to tell the subscriptions apart).
- How to change the field in the database at the end of workflow 2 so that the database trigger condition is met.
Thanks for the reply
ed19
This is a really big flaw in Bubble's list API action. I know it's hot on the lips of a lot of people on the forum, but to my knowledge they have yet to fix it.
The fix for this (and basically every other situation where you want to run API calls on a list) is to use a recursive workflow instead. Scheduling a workflow that triggers itself recursively gives you a lot of control over what happens at the end of each step; just make sure to terminate it correctly, as there's a risk of infinite loops.
- Pass your list of things as a param
- Run your actions on list:first item
- Schedule the same workflow again, but change the list to “from item #2” and add the condition: “only when list:count > 1”
- In your last step you can reverse the conditional logic to do something on your final run: “only when list:count is 1”
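The recursive steps above can be sketched in Python. This is a hypothetical model of the control flow only (Bubble workflows aren't code); the function and item names are illustrative:

```python
# Model of the recursive-workflow pattern: process the first item,
# then "reschedule" the same workflow with the rest of the list.

def process_recursively(items, results=None):
    """Run an action on list:first item, then recurse on 'from item #2'."""
    if results is None:
        results = []
    # Terminate correctly to avoid an infinite loop
    # (the "only when list:count > 1" condition on the reschedule step).
    if not items:
        return results
    first, rest = items[0], items[1:]
    results.append(f"processed {first}")           # run your actions on list:first item
    if len(items) == 1:
        results.append("final run: cleanup step")  # "only when list:count is 1"
    return process_recursively(rest, results)      # schedule the same workflow again

print(process_recursively(["sub-1", "sub-2", "sub-3"]))
# → ['processed sub-1', 'processed sub-2', 'processed sub-3', 'final run: cleanup step']
```

The key design point is that the "reschedule" only happens after the current item's actions finish, which is exactly the ordering guarantee the list action lacks.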
Note that as of a couple of months back, the schedule-on-a-list action is more workflow-unit (WU) efficient than recursive workflows, but only if you're on a new plan. So if you are on a new plan, stick to list actions where you can.
It may also be more WU-efficient to use an iterator and then search for each individual thing instead of sending the whole list to each call, but it's not something I've had to worry about too much, so I haven't put the time into testing it.
harry7
Hi @ed19
Funnily enough, I have just started moving all my recursive workflows to workflow-on-a-list to help reduce WU usage. The only thing I can find at the moment is to nest the workflows in a custom state, but it seems people have had varying success with this. I think recursive workflows will be my go-to if I can't find another solution.
Really appreciate the detailed response
ed19
@vomspace’s idea is a good one I think, though the execution depends on your needs. You could use the list action, then trigger another workflow when “item before change’s api key is not item now’s api key”.
I have a similar use case with the Google Drive API, so I'm storing keys in database items and have added an 'expires at' date field to each. This means that when I run backend workflows I can search for the latest API key that hasn't expired, and only refresh the token if that search comes back empty.
Thinking about it a bit more and taking the above into consideration, I'm not sure there's any reason for you to run two separate schedule actions. If you just conditionally grab/refresh your API keys in the same workflow as the one that updates your subs, you won't run into any timing issues and won't need to resort to recursive scheduling.
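The 'expires at' pattern above can be sketched as follows. This is a hypothetical model (the store, function names, and refresh callback are illustrative, not Bubble's API): only call the external refresh endpoint when there is no stored, unexpired token.

```python
# Cache an access token with an "expires at" date; refresh only when
# the stored token is missing or expired.
from datetime import datetime, timedelta

def get_valid_token(store, refresh):
    """Return a stored, unexpired token; otherwise refresh and store a new one."""
    token = store.get("token")
    expires_at = store.get("expires_at")
    if token and expires_at and expires_at > datetime.utcnow():
        return token                      # current key found: no refresh call needed
    new_token, lifetime = refresh()       # external API call to refresh the token
    store["token"] = new_token
    store["expires_at"] = datetime.utcnow() + lifetime
    return new_token

# Usage: the first call refreshes, the second reuses the cached token.
store = {}
calls = []
def fake_refresh():
    calls.append(1)
    return "abc123", timedelta(hours=1)

get_valid_token(store, fake_refresh)
get_valid_token(store, fake_refresh)
print(len(calls))  # → 1: the refresh endpoint was hit only once
```

This is why storing 'expires at' alongside the key avoids a refresh per subscription: everything within the token's lifetime reuses the same stored key.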
From workflow 1 → trigger workflow 2 on a list of subscriptions. The parameters of this API workflow should include any details that workflow 3 will also need.
Then from workflow 2 → trigger workflow 3 on the subscription.
This ensures that workflow 3 only triggers after workflow 2 has completed and all the conditions are met.
I also noticed you want to ensure that the whole list runs before triggering workflow 3. In that case, use the above solution but also add a field containing the full list of subscriptions, e.g. 'all subscriptions'.
On each run, check whether all the subscriptions have an access token. If they do, trigger workflow 3.
Hope this one works
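The completion check described above can be sketched like this. It's a hypothetical model (the subscription dicts and trigger callback are illustrative): after each per-item run, fire workflow 3 only once every subscription in the stored list has an access token.

```python
# Fire the next workflow only when every subscription has an access token.

def maybe_trigger_workflow_3(subscriptions, trigger):
    """Trigger once all subscriptions' companies have an access token."""
    if all(sub.get("access_token") for sub in subscriptions):
        trigger()
        return True
    return False

subs = [{"id": 1, "access_token": "t1"}, {"id": 2, "access_token": None}]
fired = []
maybe_trigger_workflow_3(subs, lambda: fired.append("workflow 3"))  # not yet: one token missing
subs[1]["access_token"] = "t2"
maybe_trigger_workflow_3(subs, lambda: fired.append("workflow 3"))  # all tokens present
print(fired)  # → ['workflow 3']
```

Because the check runs on each item's completion but only the last run passes it, workflow 3 fires exactly once, after the whole list is done.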
I don't fully understand the workflow or the use case. If I had to update a token, I would check on page load whether the current user's token has expired and, if so, run the steps to update it, or only do this when that particular token is required.
A 'Do when condition is true' event is also applicable here; however, there are indeed various ways to do things. From a backend perspective, the database trigger seemed the best way out here, as you want to do something right after something changes.
harry7
Really appreciate this. I’m not sure why I was making it so complicated when I could just do this.
To help others that may be in the same situation: what I did was change workflow 2 to a custom event, which I then trigger from within workflow 3. I added a condition to the trigger so that it only renews the token when the token's 'expires at' minus the current date/time, formatted as minutes, is <= 1. This way it is not triggered for every subscription when they share the same company.
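That final condition can be sketched as a small helper. This is a hypothetical Python model of the "expires at - current date/time:formatted as minutes <= 1" check, not Bubble's own expression language:

```python
# Refresh only when the token expires within the next minute, so several
# subscriptions sharing one company don't each trigger a refresh.
from datetime import datetime, timedelta

def needs_refresh(expires_at, now=None):
    """True when (expires at - current date/time) is <= 1 minute."""
    now = now or datetime.utcnow()
    return (expires_at - now) <= timedelta(minutes=1)

now = datetime(2024, 1, 1, 12, 0)
print(needs_refresh(now + timedelta(seconds=30), now))  # → True: about to expire
print(needs_refresh(now + timedelta(hours=1), now))     # → False: still current
```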
Thanks everyone for the help