Hi everyone,
I’m experiencing a streaming issue with my AI chat implementations, and I’m trying to understand what changed.
The problem is the following:
I’m using the standard Bubble streaming flow with the API Connector:
- OpenAI call with streaming enabled
- data.delta → Push to Text Stream
- Display data in a group (API Call’s Streamed Text)
- Inside that group, a text element displays “text so far”
This setup worked correctly before.
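For context, the flow I expect can be sketched outside Bubble as a minimal Python simulation. The chunk shapes below follow OpenAI's Chat Completions streaming format as I understand it; the assumption is that Bubble's data.delta maps onto these per-chunk deltas and "text so far" is their running concatenation:

```python
import json

# Simulated SSE lines in the shape OpenAI's Chat Completions streaming API
# sends them (each chunk carries a small "delta"; "[DONE]" ends the stream).
# These sample payloads are illustrative, not captured from a live call.
sse_lines = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo, "}}]}',
    'data: {"choices": [{"delta": {"content": "world!"}}]}',
    'data: [DONE]',
]

def accumulate(lines):
    """Append each delta chunk to the running text, the way
    'Push to Text Stream' is expected to build 'text so far'."""
    full_text = ""
    for line in lines:
        payload = line.removeprefix("data: ")
        if payload == "[DONE]":
            break
        delta = json.loads(payload)["choices"][0]["delta"]
        full_text += delta.get("content", "")
    return full_text

print(accumulate(sse_lines))  # deltas appended in arrival order
```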
Now what happens:
- The API call runs normally.
- The “Display data” step executes.
- In the debugger, I can see the text arriving.
- However, nothing is visible in the UI; the streamed text never appears.
- When I later try to access “Streamed Text Full Text” (for example, to save the final message), the workflow hangs and never completes.
What still works:
If I bypass the streamed object and instead save the final full response separately (the complete message returned at the end, data.text), that works. But then there is no streaming: the message appears all at once.
So in short:
- The display step runs, but I can only see the message in the debugger.
- Using “Streamed Text Full Text” blocks the workflow.
- Saving the final non-streamed full response works.
I haven’t modified these workflows recently. About a month ago, everything worked as expected across all my AI chats. Since then, I haven’t followed Bubble or OpenAI-related updates closely, and I didn’t change the logic or API configuration.
I’m trying to understand whether:
- There was a change in how Bubble handles streaming responses,
- Something changed in OpenAI’s streaming structure,
- There’s a known issue with “Push to Text Stream” or the streamed object behavior,
- Or whether I should be looking for a specific configuration issue on my side.
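On the OpenAI side, one thing I plan to check is whether the call now returns the newer Responses API event format rather than Chat Completions chunks, since the two put the delta text in different places. This is a hypothetical comparison sketch (the response.output_text.delta event type is from the Responses API as I understand it, not something from my Bubble setup):

```python
import json

def extract_delta(payload: str) -> str:
    """Pull the incremental text out of a streaming chunk,
    handling both shapes. Returns "" if neither matches."""
    event = json.loads(payload)
    # Newer Responses API: typed events with a top-level "delta" string.
    if event.get("type") == "response.output_text.delta":
        return event.get("delta", "")
    # Chat Completions: delta nested under choices[0].
    choices = event.get("choices")
    if choices:
        return choices[0].get("delta", {}).get("content", "")
    return ""

old_chunk = '{"choices": [{"delta": {"content": "Hi"}}]}'
new_chunk = '{"type": "response.output_text.delta", "delta": "Hi"}'
print(extract_delta(old_chunk), extract_delta(new_chunk))
```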
Any insight would be greatly appreciated.