Bubble - OpenAI

Is there a way to make text display in Bubble as it is generated by OpenAI, rather than appearing all at once as a single block?

To explain further: on ChatGPT, once you send a message, the response is streamed to you as it is generated, rather than being sent only once it is fully complete. Is there a way to replicate this experience in Bubble? It makes load times feel much shorter.

As in streaming the response? Not as far as I'm aware, without some complex workaround. The API Connector receives the response in one go, not as a series of chunks. That's the case for the completions API, at least. Which API are you using?

Yes, I'd like to stream the response, essentially applying a typewriter-style effect. I'm using the OpenAI API ("Create Completion") via the Bubble API Connector.
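
For context, what that call amounts to under the hood is roughly this (a minimal sketch; the model name and parameters are just illustrative placeholders, not my exact setup):

```typescript
// Rough sketch of what the API Connector's "Create Completion" call amounts to:
// a single POST where nothing is usable until the full JSON body has arrived.
async function createCompletion(prompt: string, apiKey: string): Promise<string> {
  const resp = await fetch("https://api.openai.com/v1/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "text-davinci-003", // placeholder model name
      prompt,
      max_tokens: 256,
    }),
  });
  const data = await resp.json(); // waits for the complete response body
  return data.choices[0].text;    // the entire answer is returned at once
}
```

So the full response only shows up in Bubble after the whole request finishes.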

You can do this pretty easily, but you'll need a plugin. There are a good number of plugins available, each with different pros and cons.

Thanks @georgecollier - I've tried some of the typewriter plugins; however, they only seem to work with a static block of text. Do you have suggestions for a plugin that would work with dynamic text output generated via OpenAI?

Yes, the typewriter plugins only animate text that has already been provided in full.
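
To illustrate, a typewriter plugin essentially does something like this (a simplified sketch, not any specific plugin's code):

```typescript
// A typewriter effect only animates text it already has in full:
// the complete string must exist before the animation starts.
function typewriter(fullText: string, onUpdate: (shown: string) => void, delayMs = 30): void {
  let shownChars = 0;
  const timer = setInterval(() => {
    shownChars += 1;
    onUpdate(fullText.slice(0, shownChars)); // reveal one more character
    if (shownChars >= fullText.length) clearInterval(timer);
  }, delayMs);
}

// Usage: typewriter(completionText, (text) => { textElement.innerText = text; });
```

It doesn't make the response arrive any sooner; it just animates text you've already received in one piece.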

To stream the response ("text streaming" is the search term/keyword you're looking for), you cannot use the API Connector. The API Connector only returns one output: the entire text. There's no way to make that happen faster in native Bubble. A different kind of connection (a streaming one, such as server-sent events or a websocket) is required to receive the text live as it's generated.
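
Under the hood, a streaming integration does something roughly like this (a minimal sketch, assuming the completions endpoint with `stream: true` and an API key available to the code; a real plugin or proxy handles this for you and keeps the key off the client):

```typescript
// Minimal sketch of consuming OpenAI's streamed completion as server-sent events.
// Each chunk carries a small piece of text that can be appended to the page immediately.
async function streamCompletion(
  prompt: string,
  apiKey: string,
  onChunk: (text: string) => void,
): Promise<void> {
  const resp = await fetch("https://api.openai.com/v1/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "text-davinci-003", // placeholder model name
      prompt,
      max_tokens: 256,
      stream: true, // ask OpenAI to stream the response instead of sending one JSON body
    }),
  });

  const reader = resp.body!.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Simplified SSE parsing: each event is a line of the form "data: {...}".
    for (const line of decoder.decode(value).split("\n")) {
      const payload = line.trim();
      if (!payload.startsWith("data:") || payload.includes("[DONE]")) continue;
      const piece = JSON.parse(payload.slice(5)).choices[0]?.text;
      if (piece) onChunk(piece); // push each fragment to the UI as it arrives
    }
  }
}
```

That's the kind of connection a streaming plugin manages for you; the API Connector can't do it because it only hands Bubble the finished response.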

Search the plugin marketplace for "AI Proxy", "ChatGPT", or "Streaming" and you should find something that works!

Got it, thanks @georgecollier - I'll give some of those a shot and see what works for me.