Hi all. I have an issue today where my backend workflow endpoint that receives webhooks has suddenly stopped parsing any of the fields. It has mysteriously started populating the raw JSON body field, but every field we parse out of that JSON into a data thing comes back null.
Has anybody had this issue before? I'm going to have to switch all of my steps to extract the values from the JSON with regex, and I have no idea why the normal way randomly stopped working this morning.
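For anyone tempted by the same regex fallback: here's a minimal sketch of what that extraction looks like outside Bubble. The payload and field names below are made up for illustration, not the actual webhook body.

```python
import re

# Hypothetical raw webhook body (field names are examples only)
raw_body = '{"order_id": "A-1042", "status": "paid", "amount": 19.99}'

# Pull one string field out with a regex instead of relying on JSON parsing
match = re.search(r'"status"\s*:\s*"([^"]*)"', raw_body)
status = match.group(1) if match else None
print(status)  # paid
```

It works in a pinch, but it's brittle (escaped quotes, nested objects, and reordered keys can all break it), which is why fixing the parsing itself is the better outcome.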
I don’t know if this is related, but I have API streaming calls that aren’t saving their responses (looks like it’s not able to parse the output json). The stream works fine, but when the stream is done, the stream’s final output fields aren’t getting set.
In my case, it seems I made a mistake by checking the "include headers in request data" checkbox after initialising my backend workflow/endpoint for the webhook. Enabling that checkbox made Bubble start providing the raw JSON body (which it usually doesn't, for some reason), but it also stopped parsing the JSON into values, so adding those values to a data thing produced empty values/fields. When I unchecked "include headers", I instantly lost the raw JSON output, but I didn't need it anymore because Bubble correctly parsed all the values like it should.
So I'm calling this human error on my part, unless it's some form of Bubble sorcery. I'm assuming that by checking the box after I had initialised (I noticed it didn't then start including headers in the request data edit/sample), the payload changed, so Bubble stopped parsing values and just dumped the raw JSON.
I have exactly the same issue using OpenAI API. I am saving the full text response in my database but I get a “Workflow error - Sorry, we ran into a temporary bug and can’t complete your request. We’ll fix it as soon as we can; please try again in a bit!”
I reported this as a bug to Bubble yesterday. I found a workaround, which is not to use the "text stream's full text". Instead (and depending on whether the API provides this data), use the final output that's sent at the end of the stream. For the "responses" API, that event is called "output_text.done", and you want "data.text", which contains the full text. When I map that out, I'm not getting errors.
This workaround only works if the api stream actually provides a full output at the end.
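To make the workaround concrete, here's a rough sketch of picking the full text out of a "/responses"-style event stream instead of accumulating deltas. The event names follow the pattern described above (`output_text.done` carrying the full text), but the sample events here are made up, so check your own stream's shapes before mapping anything.

```python
import json

# Made-up sample of event payloads from a "/responses"-style stream:
# deltas arrive first, then a final "done" event carrying the full text.
events = [
    '{"type": "response.output_text.delta", "delta": "Hello, "}',
    '{"type": "response.output_text.delta", "delta": "world!"}',
    '{"type": "response.output_text.done", "text": "Hello, world!"}',
]

full_text = None
for line in events:
    event = json.loads(line)
    # The workaround: take the full text from the final "done" event
    # instead of relying on the stream's accumulated full text.
    if event["type"].endswith("output_text.done"):
        full_text = event["text"]

print(full_text)  # Hello, world!
```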
Here’s an image of my setup that implements this workaround.
Checking with the team to see if we're seeing an influx of bug reports around this. If you could all also submit bug reports, it will help prioritize/escalate things. Thanks!
I'm using both Azure OpenAI and direct calls to OpenAI via the "/responses" endpoint. If you use the regular "/chat/completions" endpoint, you won't see these events at the end of the stream, but the "/responses" endpoint does emit them.
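Since "/chat/completions" doesn't emit a final full-text event, the only option there is to accumulate the deltas yourself. A rough sketch with made-up chunk data (the real stream ends with a literal `[DONE]` sentinel):

```python
import json

# Made-up sample of streamed chat/completions SSE lines
chunks = [
    'data: {"choices": [{"delta": {"content": "Hi "}}]}',
    'data: {"choices": [{"delta": {"content": "there"}}]}',
    'data: [DONE]',
]

parts = []
for line in chunks:
    payload = line.removeprefix("data: ")
    if payload == "[DONE]":  # end-of-stream sentinel, not JSON
        break
    chunk = json.loads(payload)
    delta = chunk["choices"][0]["delta"].get("content")
    if delta:
        parts.append(delta)

full_text = "".join(parts)
print(full_text)  # Hi there
```

So on that endpoint the workaround above has nothing to grab at the end of the stream, which matches what's described here.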
Thank you. This error rendered my client application completely useless for almost an entire day until I figured out this workaround. Exceedingly frustrating.
Thanks for suggesting this. Not sure whether to invest in this workaround or wait till Bubble fixes it. Pretty annoying tbh - not how I wanted to spend the afternoon… debugging something out of nowhere.