Synchronous Backend Workflow Execution using Custom Events

I’m trying to get a number of Backend API workflows (on a list) to execute one after the other.

In several other forum posts, it was suggested that if you wrap them in Custom Events, Custom Events are only triggered when the previous Custom Event is complete.

I’ve tried to do that (as in the screenshot), but I am still getting instances where amounts are calculated using values that are not yet ready (from a previous step), leading to errors.

Am I tackling the custom events workaround wrong? Do custom events only follow the synchronous rule when not computing on the server?

Here’s the “mother” custom event, which should have all of the listed actions (custom events) firing one after another, each once the previous one is complete (in the ideal world, anyway; in practice it’s not).

Here’s one of the custom events in the “mother” expanded. You can see it is just a wrapped API Workflow:

Here’s one of the API workflows, expanded, that a “child” custom event fires:


Instead of triggering them one after another in one workflow, I would try triggering each one at the end of the custom event that is its prerequisite. That way an event only begins when the workflow before it is complete. In your current scheme, once the “trigger” action has completed, it starts the next trigger.


Thanks! I’ll give it a go.

Didn’t work. Order still doesn’t seem to be synchronous.

Hey Simon, after scheduling the API workflow the custom event will finish, so the next custom event will be executed whether or not the scheduled API workflow is finished.
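In plain async terms, the race looks roughly like this (a TypeScript analogy only, not anything Bubble exposes; the function names are invented):

```typescript
// Conceptual analogy only -- Bubble is not TypeScript, and these function
// names (apiWorkflowStep1, apiWorkflowStep2) are invented for illustration.

// Simulate a backend API workflow that takes a while to finish.
const apiWorkflowStep1 = () =>
  new Promise<void>((resolve) =>
    setTimeout(() => {
      console.log("step 1 done (values now ready)");
      resolve();
    }, 1000),
  );

// A second workflow that happens to finish quickly.
const apiWorkflowStep2 = () =>
  new Promise<void>((resolve) =>
    setTimeout(() => {
      console.log("step 2 done");
      resolve();
    }, 10),
  );

// What "Schedule API Workflow" effectively does: hand the work off and return
// immediately, so the custom event is "finished" right away.
function customEventsFireAndForget(): void {
  void apiWorkflowStep1(); // scheduled, not awaited
  void apiWorkflowStep2(); // may finish before step 1 -- the race described above
}

// What truly sequential execution would need: each step waits for the previous one.
async function customEventsChained(): Promise<void> {
  await apiWorkflowStep1(); // step 2 only starts once step 1 has finished
  await apiWorkflowStep2();
}

customEventsFireAndForget(); // logs "step 2 done" before "step 1 done"
```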

To solve this you can, for instance:

  1. Instead of scheduling an API workflow to perform the actions, directly perform the actions in the custom event workflow.

  2. Create a data type “workflow status” in which you mark a certain workflow as completed. Add an action to the API workflow where you set this workflow as completed. Then either use a database trigger or a recursive workflow to check whether or not that API workflow is finished before you execute the next workflow (a rough sketch of this idea follows below).
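A minimal sketch of that second option, with an in-memory map standing in for the database and invented names (not Bubble syntax, just the shape of the idea):

```typescript
// Conceptual sketch only -- "WorkflowStatus", the in-memory Map standing in
// for Bubble's database, and the function names are all invented.

type WorkflowStatus = { name: string; completed: boolean };

// Stands in for the "workflow status" data type in the database.
const statusTable = new Map<string, WorkflowStatus>();

// Last action of the scheduled API workflow: mark itself as completed.
function markCompleted(name: string): void {
  statusTable.set(name, { name, completed: true });
}

// Stands in for a database trigger event or a recursive checking workflow:
// run the next step only once the status record says the previous one is done.
function runWhenCompleted(previous: string, next: () => void, intervalMs = 500): void {
  const check = () => {
    if (statusTable.get(previous)?.completed) {
      next(); // previous workflow finished, safe to continue
    } else {
      setTimeout(check, intervalMs); // not done yet, check again later
    }
  };
  check();
}

// Usage: set up the watcher, then workflow A's last action flips the flag.
runWhenCompleted("workflow_A", () => console.log("workflow B starts"));
markCompleted("workflow_A");
```

A database trigger event would replace the polling loop here, but the logic is the same: nothing downstream runs until the status record says the previous workflow has finished.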

Thanks so much @gerbertdelangen I will give this a go.


@gerbertdelangen I’m trying your method of setting a database value when a workflow is complete.

The problem I’m running into is using the “result of step 1” conditional value to set that value. My understanding is that by setting the conditional value to depend on the step prior, the database change action will wait until the API workflow is finished.

But I don’t know what that validation should be, since the workflow is running on a list of items. I’ve got a database field that stores how long the list is, and I’m using that currently, but it doesn’t seem to be working.

Any ideas?

The validation logic I’m trying to solve for is…

“Is the step prior to this finished running through the list of elements?”


Hey Simon,

The easiest thing to do is probably to add the “make changes to workflow status” action to the step 1 workflow mix_3_2_cal_num_days. Make sure to execute the action only for the last item in the list.

A quick (though probably not very resource-efficient) way to do this is to add an extra parameter to the workflow: the id of the last item in the list:

  1. Instead of scheduling the workflow directly on the list, add a new action to the workflow, type “make changes to a list” where you select the items to be changed:


  2. Schedule the API workflow on the list selected at step 1:

Note the parameter id_last_log.

  3. In the API workflow, set the workflow status only when id_last_log = current log id (sketched below).
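Roughly, the idea looks like this (a TypeScript sketch with invented names that only mirror the Log / id_last_log naming above, not Bubble syntax):

```typescript
// Conceptual sketch of steps 1-3. "Log", "idLastLog" and the function names
// are invented to mirror the idea described in this post.

type Log = { id: string };

// The per-item API workflow: only the run handling the last item in the list
// performs the "make changes to workflow status" action.
function perItemWorkflow(currentLog: Log, idLastLog: string, markCompleted: () => void): void {
  // ...per-item calculations would happen here...
  if (currentLog.id === idLastLog) {
    markCompleted(); // id_last_log = current log id, so the whole list is done
  }
}

// The scheduling side: the list is selected first (step 1), and every scheduled
// run receives the id of its last item as an extra parameter (step 2).
const logs: Log[] = [{ id: "a" }, { id: "b" }, { id: "c" }];
const idLastLog = logs[logs.length - 1].id;

logs.forEach((log) =>
  perItemWorkflow(log, idLastLog, () => console.log("workflow status set to completed")),
);
```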

This should work; if not, let us know. :grinning:

Tooooooo clever! Giving it a go now. :smiley:


Just a warning: if a search takes too long, there will be a timeout and the workflow will not be executed. To solve this problem you can use a recursive workflow; check this article from Petter, from the ultimate guides on performance and security (very useful books, by the way).


Thanks @gerbertdelangen . The arrays I’m working with are not big, so it shouldn’t be a problem.

Am I setting up my database trigger event properly? Two are working but this one is not. And I know the field is being changed, because when I check the database the value has changed from no → yes.


(Sorry for all the questions, and thank you for your help!)

It looks good. Maybe manually change the status record and see whether the workflows/changes are triggered? Don’t forget to check the logs; they often provide enough info to find the reason/bug.

That is what I have been troubleshooting. There seems to be an issue with that event being picked up, which I can’t for the life of me work out.

When I check the database after the operation runs, the database value is “yes”, but the subsequent trigger has not fired.

If I then go back and change the value back to “no” in the database manually, and then back to “yes”, the trigger does fire.

Is there any known cause of this? The problem is the database trigger event is not firing on change of that field.

Ok, just to make sure :smiley:. This might be the cause: in the settings for the data type, did you set the field to default = no? If not, it will be empty and the trigger might not fire.
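For what it’s worth, the empty-field problem can be pictured like this (an invented condition, not Bubble’s actual trigger internals):

```typescript
// Tiny illustration of why an unset field can stop a "changes from no to yes"
// style trigger from firing. The Status type and the condition are invented.

type Status = { completed?: boolean }; // the field may be empty if no default was set

// A condition in the spirit of "completed before change is no AND after change is yes":
const shouldFire = (before: Status, after: Status): boolean =>
  before.completed === false && after.completed === true;

console.log(shouldFire({ completed: false }, { completed: true })); // true
console.log(shouldFire({}, { completed: true }));                   // false: "before" was empty, not "no"
```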

I did :slight_smile: I’ve even tried changing the field type for that particular trigger field to a number and setting the change from 1 → 2, but that’s not working either.

After a quick skim it looks like you want to do something like …

Step 1: Run a workflow on a list
Step 2: Wait until all the workflows triggered by Step 1 have finished, and then do some other thing

Is that right?

If so, then that isn’t how it works, sadly.

So you would need to back away from the “schedule on a list” and build a recursive workflow that does this for you.
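Conceptually, such a recursive workflow does something like this (a TypeScript analogy with invented names; in Bubble it would be an API workflow that schedules itself on the remainder of the list):

```typescript
// Conceptual sketch of a recursive backend workflow, not Bubble syntax: process
// one item per run, reschedule yourself with the rest, and only run the
// dependent step once the list is empty. All names are invented.

type Item = { id: string };

function processItem(item: Item): void {
  console.log(`processed ${item.id}`);
}

function recursiveWorkflow(remaining: Item[], onAllDone: () => void): void {
  if (remaining.length === 0) {
    onAllDone(); // nothing left, so it is now safe to run the next workflow
    return;
  }
  const [current, ...rest] = remaining;
  processItem(current);
  // In Bubble this would be the workflow scheduling itself with the remaining
  // list; setTimeout stands in for that "Schedule API Workflow" step.
  setTimeout(() => recursiveWorkflow(rest, onAllDone), 0);
}

recursiveWorkflow([{ id: "1" }, { id: "2" }, { id: "3" }], () =>
  console.log("all items finished, run the follow-up step"),
);
```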


Thanks Nigel. After lots of troubleshooting, I came right. Appreciate the response regardless.


What was the solution? I’m trying to get backend workflows to run synchronously too.


This topic was automatically closed after 70 days. New replies are no longer allowed.