Hey all, I've been working on resolving this issue for the past two weeks; it seems like a critical issue that Bubble hasn't addressed.
I have an API that retrieves data, which I then upload to the database in a backend workflow. My goal is to call the API every 30 seconds, scrub the data for unique items that aren't already in the database, and upload those items.
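For context, here's the logic I'm going for, as a rough Python sketch (the endpoint URL, field names, and helper function are all made up for illustration; in Bubble this is all backend workflows, not code):

```python
import time
import requests

API_URL = "https://example.com/client-data"  # placeholder endpoint

seen_ids = set()  # stands in for "does the database already have this item?"

def upload_to_database(item):
    # Stand-in for the actual database insert step
    print("inserting", item["unique_id"])

def poll_once():
    items = requests.get(API_URL).json()
    for item in items:
        # Only upload items whose unique ID isn't already in the database
        if item["unique_id"] not in seen_ids:
            seen_ids.add(item["unique_id"])
            upload_to_database(item)

while True:
    poll_once()
    time.sleep(30)  # call the API every 30 seconds
```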
The issue is that when I run the workflow, duplicate items arrive in the Bubble database. When I watch the database updating in real time, I can see the duplicates being entered simultaneously. I even created a separate data type as a proxy: the workflow would update the proxy with the new unique item, and then a second step would copy that proxy item into the main database. Even with this, I am still seeing duplicates added.
Is anyone else running into this issue, or can anyone advise on it?
WF: “triggercapitrigger” triggers the workflow “triggercapi” after 60 seconds
WF: “triggercapi” triggers the workflow “returnclientapi” immediately, and then retriggers “triggercapitrigger” to get the process started over again, recursively.
WF: “returnclientapi” calls the API for the data and then triggers “uploadclientdataapi”. When it triggers “uploadclientdataapi”, it sifts through the API data, finds the first unique item, and maps each respective field to that item.
WF: “uploadclientdataapi” simply adds the info to the database, again “only when the database doesn't have this item in it” (known from the unique ID that each data entry has from the API). A rough sketch of the whole chain is below.
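Here's that chain written out in Python, just to make the flow concrete (the stubs, the threading.Timer scheduler, and the in-memory dict are stand-ins for illustration, not how Bubble's Schedule API Workflow actually works under the hood):

```python
import threading

database = {}  # unique_id -> item; stands in for the Bubble data type

def call_api():
    # Stand-in for the actual API call
    return [{"unique_id": "abc123", "name": "example"}]

def first_unique(data):
    # Find the first item whose unique ID isn't already in the database
    for item in data:
        if item["unique_id"] not in database:
            return item
    return None

def triggercapitrigger():
    # Schedules "triggercapi" to run 60 seconds from now
    threading.Timer(60, triggercapi).start()

def triggercapi():
    # Triggers "returnclientapi" immediately, then re-triggers
    # "triggercapitrigger" so the loop keeps going recursively
    returnclientapi()
    triggercapitrigger()

def returnclientapi():
    data = call_api()
    item = first_unique(data)
    if item is not None:
        uploadclientdataapi(item)

def uploadclientdataapi(item):
    # "Only when the database doesn't have this item in it"
    if item["unique_id"] not in database:
        database[item["unique_id"]] = item

triggercapitrigger()  # kick off the loop
```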
The problem is that this all works perfectly, except that every couple of entries it will literally upload the same item twice at the exact same time, as if the workflow is sending two items instead of one.
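My best guess at what I'm seeing, sketched in Python: if two runs of the chain ever overlap, both can pass the “doesn't have this item” check before either one finishes writing, so both insert. This is just my hypothesis, not something I've confirmed in Bubble:

```python
import threading
import time

database = []  # list of unique IDs already inserted

def upload_if_missing(unique_id):
    # Step 1: the check — two overlapping runs can both get here
    # and both see the ID as "missing"
    if unique_id not in database:
        time.sleep(0.01)  # the window between the check and the write
        # Step 2: the write — by now the other run may have inserted
        # the same ID, but this run no longer re-checks
        database.append(unique_id)

# Two overlapping workflow runs trying to insert the same item
t1 = threading.Thread(target=upload_if_missing, args=("abc123",))
t2 = threading.Thread(target=upload_if_missing, args=("abc123",))
t1.start(); t2.start(); t1.join(); t2.join()

print(database)  # usually prints ['abc123', 'abc123'] — a duplicate
```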