In my app I have a page-load workflow that creates a new piece of data. The record is built from hidden input fields whose figures are populated by background calculations. At first I noticed that some of the data wasn’t being written when the “thing” was created, and I assumed this was because the input fields hadn’t finished calculating before that step in the workflow ran. I added a ten-second pause before the step to see if this would remedy the issue, and it did.
I pushed to live thinking the problem was solved and started tinkering with other things, and then I noticed that the log that had been created had reverted to not displaying the correct information. I checked the database and saw that, although I had tested this successfully several times, the fields in question were all 0s. I could have sworn I had seen it work, so I started testing again.
Here is the data showing correctly on the page. But in the database I saw this:
So first of all, you realize that the dev and live databases are separate, yeah? If you populated dev with the info you rely on, that data is not also in live unless you copy the dev DB to live.
You can copy just the data types you need so you don’t do something bad like overwrite live users that do not exist in dev.
Yes, I know there are two databases. You’ll see in the video that the data changes on page refresh. To be clear though, the log isn’t actually erased as it appears in the video; the repeating group (RG) just filters out logs with an offset of 0. But you do see the offset go from being correct back to zero, and the log is thus removed from the RG.
Sorry, just offering the most common cause of such questions/issues. Am mobile and can’t properly review the video ATM! Sounds like one of those simple/but-vexing problems…
Hopefully it’s a simple fix, yeah. I’ve never seen data get written and then changed like that, so I’m thinking it might be a Bubble bug I’ve stumbled on. I will likely submit a report if I don’t find a fix tonight. Thanks!
Sorry, but I must ask the dumb question: do you know the data was written to the thing before it was removed, or is that a supposition based on observed behavior?
From your description and the video, I’m not clear on what’s being calculated in the page loaded event, what’s being written to the thing, and in what order.
Again, not seeing the Bubble workflows: is it possible something is triggering a second “Make changes to a thing” whose source is zeros after the initial correct update? If so, would workflows running in the wrong sequence explain the behavior you see?
Naturally, I don’t have a clue, but I’m hoping to trigger some thinking and digging on your part. I’ve found Bubble to be very finicky with regard to timing of workflow actions. And I’ve created scenarios where two events (workflows) conflict and produce results suggesting that workflows and workflow actions don’t run in the sequence you might expect.
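To make that conflict scenario concrete outside of Bubble: two uncoordinated writes to the same record can complete out of order, so a later write of zeros can clobber an earlier correct write. This is a hedged sketch only — `updateRecord` and the timings are my own illustrative stand-ins, not Bubble’s actual internals:

```javascript
// Hypothetical sketch: two "workflows" updating the same record without
// coordination. The slower write wins, regardless of which was "right".

const db = { offset: null }; // stand-in for the record in the database

function updateRecord(fields, delayMs) {
  // Simulate a server-side write that takes `delayMs` to complete.
  return new Promise((resolve) =>
    setTimeout(() => {
      Object.assign(db, fields);
      resolve();
    }, delayMs)
  );
}

async function main() {
  // Workflow A writes the correctly calculated offset...
  const a = updateRecord({ offset: 42 }, 50);
  // ...while workflow B fires with a not-yet-calculated zero, but its
  // write happens to land later.
  const b = updateRecord({ offset: 0 }, 100);
  await Promise.all([a, b]);
  console.log(db.offset); // 0 — the later write clobbered the correct value
}

main();
```

If something like this is happening, you would briefly see the correct value (between the two writes) before it reverts to zero — which matches the on-page-then-reverted behavior described above.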
Keep in mind, Bubble is built on Node.js. That means work is done in parallel most of the time and isn’t synchronized the way you’d expect in many cases.
What you think will run one by one is in fact simultaneous.
Workflow_1: action_1 + action_2 + action_3 is treated like this (all three starting at once):
action_1
action_2
action_3
If you add a condition on action_3 ’ IF action_2 is not empty ’ then it will be:
action_1
action_2
→ then action_3 (only once action_2 has a result)
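As a rough Node.js analogy (this is not Bubble’s actual scheduler, just an illustration of the same idea): without dependencies, async actions are launched together and finish in duration order, not list order; adding a condition on action_3 effectively makes it wait for action_2:

```javascript
// Illustrative analogy only — action names and timings are made up.
const log = [];
const run = (name, ms) =>
  new Promise((resolve) =>
    setTimeout(() => { log.push(name); resolve(name); }, ms)
  );

async function workflowWithoutCondition() {
  // All three actions start at once; completion order depends on duration.
  await Promise.all([
    run("action_1", 30),
    run("action_2", 20),
    run("action_3", 10),
  ]);
}

async function workflowWithCondition() {
  // action_1 and action_2 still start together...
  const a1 = run("action_1", 30);
  const a2 = run("action_2", 20);
  const result2 = await a2;                 // "IF action_2 is not empty"
  if (result2) await run("action_3", 15);   // ...but action_3 waits for action_2
  await a1;
}

(async () => {
  await workflowWithoutCondition();
  console.log(log); // ["action_3", "action_2", "action_1"] — fastest first
  log.length = 0;
  await workflowWithCondition();
  console.log(log); // ["action_2", "action_1", "action_3"] — action_3 after action_2
})();
```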
This may get off-topic, so I’d suggest searching the forum for discussions of timing. Here’s just one example of the kind of thing I’ve experienced:
JohnMark touches on timing above.
Frequently, people suggest adding a pause to help with timing, but there is no reliable rule for how long a pause should be to get consistent results. Here’s another take on that:
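To show why a fixed pause is fragile in code terms: a pause guesses at how long a calculation takes, whereas waiting on a condition doesn’t. A minimal Node.js sketch — `waitFor` and `calculated` are my own stand-ins, not Bubble APIs; in Bubble the equivalent would be an “Only when” condition on the action or a “Do when condition is true” event:

```javascript
// A background calculation finishes at an unpredictable time.
let calculated = null;
setTimeout(() => { calculated = 42; }, 75);

let written = null; // stand-in for the value actually saved to the thing

function waitFor(check, intervalMs = 10, timeoutMs = 1000) {
  // Poll until `check()` returns a non-null value, instead of guessing a delay.
  return new Promise((resolve, reject) => {
    const start = Date.now();
    const timer = setInterval(() => {
      const value = check();
      if (value != null) { clearInterval(timer); resolve(value); }
      else if (Date.now() - start > timeoutMs) {
        clearInterval(timer);
        reject(new Error("timed out waiting for calculation"));
      }
    }, intervalMs);
  });
}

(async () => {
  written = await waitFor(() => calculated);
  console.log(written); // 42 — written only once the calculation is done
})();
```

A ten-second pause would also “work” here, but only by accident: if the calculation ever takes eleven seconds, the zeros come back.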
Doing something about timing is clearly an issue for Bubble, according to the roadmap, https://bubble.io/roadmap:
"Now
App performance
Systematically optimize most data-intensive queries
Monitor and reduce JS / JSON size
Apps bug-free at scale
Workflow consistency guarantees"
Perhaps I’m misinterpreting, but “Workflow consistency guarantees” sounds to me like workflows running as expected every time, i.e. timing and sequencing.
What we mean by “Workflow consistency guarantees” is making sure that if two workflows touch the same thing in the database at exactly the same time, we have a clear rule about the way the data is impacted. This is a corner case at scale that you’re very likely not hitting (I think I already told you that in another thread).