[New feature] Native support for API streaming

Hey Bubble community,

Rutvij here. I’m the PMM for AI at Bubble and I’m super excited to announce a feature that’s been highly requested: support for API streaming!

Bubble Ambassadors, Insiders, Enterprise users, and Gold agencies have had early access for about a week. @bestbubbledev, one of our Insiders, was nice enough to pull together this demo.

What is streaming?

Streaming API responses allows you to receive real-time data updates from your API endpoints. There are both AI and non-AI use cases.

Specifically for AI use cases, enabling streaming means your users will see responses from services like ChatGPT or Claude word by word, instead of all at once. It’s a game-changer for building real-time features like AI chatbots.
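To make "word by word" concrete: under the hood, providers like OpenAI send the response as a stream of Server-Sent Events, each carrying a small text delta that the client appends as it arrives. Here's a minimal sketch of what parsing that stream looks like, assuming OpenAI's chat-completions SSE format (other providers use slightly different shapes — check their docs):

```python
import json

def parse_sse_chunks(raw_stream: str):
    """Parse OpenAI-style Server-Sent Events and yield each text delta."""
    for line in raw_stream.splitlines():
        line = line.strip()
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":  # sentinel marking the end of the stream
            break
        event = json.loads(payload)
        delta = event["choices"][0]["delta"].get("content", "")
        if delta:
            yield delta

# Example: chunks arrive over time; the UI appends each one as it lands.
raw = (
    'data: {"choices": [{"delta": {"content": "Hello"}}]}\n'
    'data: {"choices": [{"delta": {"content": " world"}}]}\n'
    'data: [DONE]\n'
)
print("".join(parse_sse_chunks(raw)))  # → Hello world
```

Bubble handles all of this for you — you never parse the events yourself — but this is what the platform is doing behind the "text so far" value.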

How to get started

  1. Set up an API call to the service of your choice
  2. Set the data type as stream
  3. Make sure to add “stream”: true or the equivalent parameter to your API call (based on the service you are calling — check their docs)
  4. Select the values you want returned in the stream after initializing the API call
  5. Just like regular JSON API calls, streaming APIs can be used both as a data source and within workflows. Make sure you check our documentation at this step, since Bubble behaves slightly differently on the client side versus the server side.
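For step 3, here's a sketch of what the request body might look like. This example assumes OpenAI's Chat Completions API, where the parameter is literally `"stream": true`; other services (Anthropic, etc.) have their own equivalents, so check the provider's docs:

```json
{
  "model": "gpt-4o",
  "messages": [
    { "role": "user", "content": "Write a haiku about streaming." }
  ],
  "stream": true
}
```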

Resources

  • Documentation: Check out our detailed guide on how to set up and stream API responses.
  • Support: If you have any questions or need assistance, our support team is here to help.

What’s next?

We’re just getting started with API streaming! Right now, we support streaming text data; audio and video streaming aren’t supported yet. We also don’t yet support streaming in native mobile apps on BubbleGo.

Start building!

We can’t wait to see how you’ll leverage streaming in your builds. Get ready to create cutting-edge AI apps with a seamless user experience!

Happy building, and let us know how it goes.

— Rutvij

32 Likes

Great! Does it support all types of streaming/push (WebSocket, SSE, HTTP/2 Push, plain HTTP, …)?

5 Likes

FUCK yesss

3 Likes

This is cool.

Does this mean these things are in the works? And if so, is there any kind of timeframe or insight into which will be next?

3 Likes

Does Bubble’s streaming support currently handle SSE (Server-Sent Events), or is it limited to chunked application/json streams (like OpenAI’s streaming API)?

2 Likes

Thanks!! Looks great. How do I show the streaming text in a Multiline Input element?!!

Can we get a quick guide spun up (along with a demo app)? I’ve been able to get the API call initialized but have no idea how to actually visualize the streamed text - I’ve tried displaying both the text stream itself and the text so far with no success

1 Like

Same actually - literally cannot figure out how to stream into multiline. Have tried two different LLM providers

1 Like

Same! I’ve got a support ticket going, but I also can’t get any of the text stream data into the design.

My API is connected (with stream enabled) and I receive a successful response when initialized. In my design, I’m able to see “is waiting,” “is done,” and “is streaming” values change in real-time, but the text never comes through (neither “text so far” nor “full text”).

:thinking:

3 Likes

@romanmg & @reggie.s

I’ve got it sorted now…

  1. Workflow

  2. Group

    • Type of content = the API call’s return type.

  3. Live text element

    • Text (or Multiline Input) content →
      Parent group's Text stream's text so far

I believe this works because Bubble keeps the object “live” and refreshes all of its internal keys (text so far, full text, is streaming, etc.) as chunks arrive.
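That “live object” idea can be sketched in a few lines. This is just an analogy, not Bubble’s actual implementation — the field names mirror the editor’s “text so far” / “is streaming” expressions:

```python
class TextStream:
    """Toy model of a live text-stream object: one object, mutated in
    place as chunks arrive, so anything bound to it sees each update."""

    def __init__(self):
        self.text_so_far = ""
        self.is_streaming = True

    def receive_chunk(self, chunk: str):
        # Each incoming chunk updates the same live object in place,
        # so a text element bound to text_so_far re-renders.
        self.text_so_far += chunk

    def finish(self):
        self.is_streaming = False

stream = TextStream()
for chunk in ["Hel", "lo ", "there"]:
    stream.receive_chunk(chunk)
stream.finish()
print(stream.text_so_far)  # → Hello there
```

That’s why binding the group to the API call’s return type and pointing a text element at “text so far” works: the element reads from the same object Bubble is updating.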

1 Like

I would probably just display it in a textblock and when the streaming is complete, hide that and display a multiline input with the same font size

Thanks for the early access! I was very pleased to see this available via an Action for Backend Workflows now… :fire::fire::fire: Talk about dropping easter eggs!

Now we just need WebSocket and/or WebRTC support, and this platform becomes unstoppable, with major features native alongside the plugin ecosystem!!!

2 Likes

Love this. Works perfectly!

This feature works well for me. Great work!

If I’m displaying a list of messages in a repeating group, I can stream the response client-side into the last message at the end of the group using this new functionality. Once the stream finishes, I save the response to the database, then replace the streamed content with the saved data so it displays natively within the repeating group.

It works but does anyone know a better / more robust way to do it?

1 Like

Here’s my makeshift setup that I just tested with:

I have a group with the data type “text stream” on the page. After the API call step, I display the result of that step (the entire text stream) in that group. Once it’s there, you can easily add a text element within the group and set it to “Parent group’s text stream’s text so far”.

Hope this helps.

I’ve made a video that covers how to set up a streaming API here:

11 Likes

How does this work with a repeating group? I.e., in the case of a back-and-forth conversation in a chatbox?

I tested - perfect!
BUT… I’m using it for long responses and hit a timeout at 300,000 milliseconds. Any chance to increase this limit? Or should I stop before the timeout and somehow make another call, sending the first part of the text and asking it to continue the response???
By the way, can we stop the streaming??

WebSockets next please! Extremely important for multimodal AI (video, audio, etc.)!