How to Poll OpenAI in Bubble's Backend

Hey Everyone,

So I am sure everyone is aware that there is no easy way to stream OpenAI responses currently. What we have to do instead is create a Run in OpenAI, and then poll the endpoint until the response is marked as “completed” or “failed”.
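Outside of Bubble, the polling loop looks roughly like this. This is just a sketch of the pattern, not the exact workflow from the video: `fetch_status` stands in for a GET to the Run endpoint (in Bubble you would do this with a recursive backend workflow that re-schedules itself every few seconds instead of a `while` loop). The set of terminal statuses is my assumption based on the Assistants API status values.

```python
import time

# Statuses where the Run is finished and polling should stop
# (assumption: Assistants API status names).
TERMINAL_STATUSES = {"completed", "failed", "cancelled", "expired"}

def poll_run(fetch_status, interval=1.0, timeout=60.0):
    """Call fetch_status() repeatedly until the Run reaches a terminal
    status or the timeout expires. fetch_status is any callable that
    returns the Run's current status string (e.g. an HTTP GET to
    /v1/threads/{thread_id}/runs/{run_id})."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status()
        if status in TERMINAL_STATUSES:
            return status
        time.sleep(interval)  # wait before polling again
    raise TimeoutError("Run did not reach a terminal status in time")
```

In Bubble the equivalent is a backend workflow that checks the status and, if it is not terminal yet, schedules itself again with a short delay.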

I’ve filmed a video on how you can set this up using as few workflows as possible, and also how you can get the responses from your assistants back in a fixed JSON structure that you can actually work with in Bubble.
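One way to get that fixed JSON structure is the API's `response_format` with a JSON schema, which makes the model's reply match a shape you define, so Bubble can initialize the call once and map the fields reliably. The field names below are purely illustrative (not from the video), and I am assuming the schema-based `response_format` here; check the OpenAI docs for your exact setup.

```python
import json

# Hypothetical schema: force replies shaped like
# {"answer": "...", "sources": ["...", ...]}.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "answer_payload",
        "schema": {
            "type": "object",
            "properties": {
                "answer": {"type": "string"},
                "sources": {"type": "array", "items": {"type": "string"}},
            },
            "required": ["answer", "sources"],
            "additionalProperties": False,
        },
    },
}

def parse_reply(raw_text):
    """Parse the assistant's message text into a dict whose keys match
    the schema above, ready to map onto Bubble fields."""
    return json.loads(raw_text)
```

Because the structure is fixed, Bubble's API Connector only needs to be initialized against one sample response.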

I also released a plugin that handles the polling for you if anyone wants to use it, but I’m hoping this video helps you set up your backend polling process in a more scalable way at the very least.

Good Luck!

Plugin Link: OpenAI Poller | APG Software Solutions


@adam30 Thanks for the video and plugin, I’ve been carefully dissecting each piece of it.

Are you manually setting the API response for your Bubble workflow so it matches the JSON schema of the assistant?

Bubble has announced that streaming will be released very soon; it is in beta testing now. Hopefully it will be available for everyone to use shortly. :blush: