I’m implementing a chat that uses the streaming API feature.
My problem is that when the API is called, the loading bar at the top appears until the complete response is returned. Until I get the full response, the user can’t perform any other actions, such as navigating the platform or stopping the response. This is problematic because the chat may take some time to respond.
Also, the fact that everything is managed on the frontend seems problematic to me; it doesn’t account for users losing internet connection, closing the browser, or having two chats open at the same time.
When I try to use the streaming API on the backend, it doesn’t really behave like streaming; it waits for the complete response.
Are these behaviours expected, or am I doing something wrong?
They seem significant to me; I can’t release my chat to production unless it works 100% and is reliable.
I appreciate any help.
Best,
Bubble’s built-in API connector doesn’t support true streaming on the backend.
What you’re seeing is expected behavior:
• Frontend calls = Bubble blocks the UI until the full response returns
• Backend calls = Bubble waits for the entire payload before continuing the workflow
So Bubble treats “streaming” as a normal API call; it doesn’t stream chunk-by-chunk the way a native WebSocket or Node environment does.
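For contrast, here's a minimal sketch of what chunk-by-chunk consumption looks like in a Node 18+ environment. The stream below is a local stand-in for a streamed API response body (a real call would be `fetch(...)` against your provider); the point is that each chunk is available the moment it arrives, rather than after the whole payload lands.

```javascript
// Local stand-in for a streamed response body arriving in three chunks.
// In a real proxy this would be `(await fetch(url, opts)).body`.
const encoder = new TextEncoder();
const decoder = new TextDecoder();

const body = new ReadableStream({
  start(controller) {
    for (const piece of ["Hel", "lo wor", "ld"]) {
      controller.enqueue(encoder.encode(piece));
    }
    controller.close();
  },
});

// Consume tokens as they arrive instead of waiting for the full payload.
async function readChunks(stream) {
  const reader = stream.getReader();
  const chunks = [];
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(decoder.decode(value, { stream: true }));
  }
  return chunks;
}
```

Bubble's API connector has no equivalent of this loop — it only hands your workflow the fully assembled response.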
That’s why the reliable way to implement real-time chat in Bubble is:
1. Use a server-side proxy (Node / Cloudflare Worker / Supabase Edge Function)
This handles true streaming and sends partial tokens as they arrive.
2. From Bubble, subscribe to the proxy’s updates (typically through a plugin or custom HTML element that can hold an open connection)
3. The proxy pushes token updates → Bubble receives them → renders them live
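The core of step 3 is the proxy parsing the upstream stream and pulling out the token text to forward. Here's a sketch of that parsing step, assuming an OpenAI-style SSE stream (`data: {...}` events ending with a `data: [DONE]` sentinel) — the payload shape `choices[0].delta.content` is that provider's format and would need adjusting for others.

```javascript
// Split an SSE buffer into `data:` events and pull out the token deltas.
// Assumes OpenAI-style chat-completion chunks; adapt the payload path
// (choices[0].delta.content) for your provider.
function extractTokens(sseText) {
  const tokens = [];
  for (const line of sseText.split("\n")) {
    if (!line.startsWith("data: ")) continue;   // skip blank lines / comments
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") break;            // end-of-stream sentinel
    const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
    if (delta) tokens.push(delta);
  }
  return tokens;
}
```

Each extracted token is what the proxy pushes onward for Bubble to render live.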
This removes the blocked UI and the reliability problems you mentioned (lost connections, closed browsers, multiple open chats), because the conversation state lives server-side rather than only in the page.
Bubble alone can’t do perfect streaming right now; you need a small proxy layer to make it production-grade.
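To show how small that proxy layer can be, here's a Cloudflare Worker / Node 18+ style handler that calls an upstream streaming endpoint and pipes its body straight through, so clients receive tokens as they arrive. The URL, model payload, and env var are placeholders, not a real configuration; `fetchImpl` is injectable so the handler can be exercised without a live API.

```javascript
// Minimal streaming pass-through proxy (sketch). Upstream URL and
// Authorization secret are placeholders; replace with your provider's.
async function proxyChat(userMessage, fetchImpl = fetch) {
  const upstream = await fetchImpl("https://api.example.com/v1/chat/stream", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.API_KEY}`, // placeholder secret
    },
    body: JSON.stringify({
      stream: true,
      messages: [{ role: "user", content: userMessage }],
    }),
  });
  // Re-emit the upstream body untouched: clients consume it chunk-by-chunk
  // as a server-sent-event stream instead of waiting for the full reply.
  return new Response(upstream.body, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
    },
  });
}
```

Because the proxy owns the connection to the model, a dropped browser or second open tab doesn't kill the generation — the Bubble page just resubscribes to the stream.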
This is excellent information, thank you very much.
I hope Bubble implements some of these things natively; the streaming API, as it is, is useless for production apps.