Hello everyone, I'd like to start a discussion about the "Do every X seconds" workflow event and about performance.
I'm using an external API that doesn't offer webhooks for change notifications. To simplify, it's the API of an e-commerce platform, and I want to pull information about orders that have reached "Paid" status. Since there are no webhooks, I thought about using "Do every X seconds" to fetch only orders whose status is "paid", but I ran into scalability and performance concerns. Does this make sense? If I set it to run every 20 or 30 seconds, that would only happen while the browser is open, right? Is there something similar I can do in a backend workflow?
Thanks a lot for any solutions!
You would definitely only want this to be a backend function.
You can set up either a recurring or a recursive workflow in the backend to make the API call that checks statuses on platforms that don't provide webhooks (most do, though, and don't be afraid to reach out to their dev team either; you'd be surprised at what you can get help with).
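In Bubble you'd build this visually as a backend workflow that schedules itself again after each run, so there's no code to write. Purely as an illustration of that recursive polling pattern, here's a TypeScript sketch; the API URL, the `status=paid` query parameter, and the `processOrder` helper are assumptions, not the platform's actual API.

```typescript
// Sketch of a recursive polling loop: each run fetches paid orders,
// processes them, and schedules the next run. This mirrors a Bubble
// backend workflow that re-schedules itself at the end of each pass.

const API_BASE = "https://api.example-ecommerce.com"; // placeholder URL
const POLL_INTERVAL_MS = 5 * 60 * 1000;               // e.g. every 5 minutes

interface Order {
  id: string;
  status: string;
}

async function fetchPaidOrders(): Promise<Order[]> {
  // Assumes the API can filter by status via a query parameter.
  const res = await fetch(`${API_BASE}/orders?status=paid`);
  if (!res.ok) throw new Error(`API returned ${res.status}`);
  return res.json() as Promise<Order[]>;
}

async function processOrder(order: Order): Promise<void> {
  // Placeholder: save the order to your own database, notify the user, etc.
  console.log(`Handling paid order ${order.id}`);
}

async function pollOnce(): Promise<void> {
  try {
    const orders = await fetchPaidOrders();
    for (const order of orders) {
      await processOrder(order);
    }
  } catch (err) {
    console.error("Polling failed, will retry on next run:", err);
  } finally {
    // Re-schedule the next run regardless of success, just like a
    // recursive backend workflow scheduling itself again.
    setTimeout(pollOnce, POLL_INTERVAL_MS);
  }
}

void pollOnce();
```

A recurring workflow (fixed schedule) is simpler to reason about; a recursive one gives you more control, e.g. backing off the interval when the API is slow or rate-limited.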
Thanks for the answer. Unfortunately, the platform in question doesn't have webhooks yet. They even floated the idea of building a function that would search for orders from time to time to find the paid ones, but they still haven't added webhooks. Maybe it does make sense to create a backend workflow that fetches orders by querying the e-commerce platform's API every X minutes. I'm going to rack my brain over this solution, but maybe that's what should be done! Thanks!!!
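One detail worth planning for with interval polling is avoiding duplicate processing of the same order on every pass. A minimal sketch of one approach, assuming (and this is only an assumption, check the platform's docs) that the API supports something like an `updated_since` filter:

```typescript
// De-duplication sketch for interval polling: remember the last successful
// check and ask only for orders updated since then. The "updated_since"
// parameter and the response shape are assumptions about the e-commerce
// API, not documented behavior.

const API_URL = "https://api.example-ecommerce.com/orders"; // placeholder

let lastCheckedAt = new Date(0); // start from the epoch on the very first run

async function fetchNewlyPaidOrders(): Promise<unknown[]> {
  const url =
    `${API_URL}?status=paid&updated_since=${lastCheckedAt.toISOString()}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`API returned ${res.status}`);
  const orders = (await res.json()) as unknown[];
  lastCheckedAt = new Date(); // advance the cursor only after a successful fetch
  return orders;
}
```

If the API has no incremental filter, the equivalent in Bubble would be storing the IDs of already-processed orders and skipping them on later runs.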
@machadoa953
I know this is a very old topic, but I'd love it if you could share any solution you came up with.
Do not use "Do every X seconds" for any data-related interactions unless you know what you are doing.