Hi - has anyone had a chance to use the new ChatGPT API in their Bubble app yet? If so, I'd be grateful for any tips on how to get it going. I'm using the GPT-3 Davinci model in my app at the moment. It's performed great to date, but I'd like to migrate over to GPT-3.5 Turbo as soon as possible.
What is the problem? It works like a text completion, just another API request.
Thanks Eugene - I've tried running the request but it's not working for some reason. I was hoping to get some screen grabs from someone who has been able to get it going using the API Connector.
See my API Connector in the Plugins tab and jump to the GPT-3.5 call (gpt-3.5).
Editor link: Bubble-solutions | Bubble Editor
Don't forget to set your API key: Bearer sk-JqKBREA8b4 …
Video: Bubble.io - OpenAI GPT-3.5-turbo & GPT-3.5-turbo-0301 API Tutorial - YouTube
New endpoint: https://api.openai.com/v1/chat/completions
When saving to a custom state: Body choices:first item's message's content
They added messages, so it is no longer choices' text.
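To illustrate the change in shape, here is a minimal Python sketch of the new chat/completions request and response (the model name and message text are just placeholder examples, not taken from the thread):

```python
import json

# New-style request body for /v1/chat/completions: a "messages" array of
# role/content objects replaces the old single "prompt" string.
request_body = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}

# Example response shape: the text now lives under message.content,
# where the old completion endpoint used choices[0].text.
response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hi there!"}}
    ]
}

# Equivalent of Bubble's "Body choices:first item's message's content":
reply = response["choices"][0]["message"]["content"]
print(reply)  # → Hi there!
```

That last line is the path you point the API Connector (and your custom state) at.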
Amazing thanks! Have you managed to get the stream function working via Bubble?
I haven’t had a chance to look at that, yet. Sorry
Not yet, but sounds interesting. I’ll see later today
Anyone else having trouble adding previous chats to the ChatGPT API call?
I'm pasting in the previous answers as part of the array of messages, and it works fine unless the previous AI response includes anything like quotation marks, etc., in which case I get this error:
This is because of how JSON works: unescaped quotes within the call will break it.
Use find-and-replace to escape the quotes.
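In Python terms, the find-and-replace approach amounts to something like this sketch (the sample reply text is made up; note the replacement order matters, since escaping backslashes after quotes would double-escape them):

```python
import json

# A previous AI reply containing characters that break a raw JSON body
previous_reply = 'She said "hello" and left.\nThen returned.'

# Bubble-style find-and-replace: backslashes first, then quotes,
# then newlines.
escaped = (previous_reply
           .replace("\\", "\\\\")
           .replace('"', '\\"')
           .replace("\n", "\\n"))

# Outside of Bubble, json.dumps performs the same escaping in one step:
assert json.dumps(previous_reply) == '"' + escaped + '"'
```

In Bubble itself you would chain the equivalent :find & replace operators on the input's value before inserting it into the JSON body.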
Here’s how I did it for user inputs sent to the OpenAI completion endpoint…
Amazing, thanks!
Have you managed to get the stream function working via Bubble?
Not yet.
I was able to initialize and receive the streamed data in the API Connector, but didn't receive the complete data when called from a workflow.
So I'll leave it for another day.
Not sure if this helps: How to stream response in javascript? - #4 by asabet - General API discussion - OpenAI API Community Forum
Has anyone worked out how to feed in the message history (prompt and response) so the model can reference things from earlier in the conversation?
I'm not sure how to set this up in the API Connector. It seems like you want to store the history in a variable and then feed it back into the POST request each time you ask a question, but I'm not sure how this would work.
For anyone who stumbles on this: here is a very inelegant solution which can certainly be tidied up, but it has the required functionality.
I'm a bit lazy to type it up, but hopefully the screenshots should help; if there are any questions, just ping the thread.
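For readers without Bubble open, here is a rough Python sketch of the general idea (function names and message text are illustrative only, not taken from the screenshots): the chat endpoint is stateless, so you keep the running conversation in a list and resend the whole list with every request.

```python
# Keep every turn of the conversation in one list of role/content objects.
history = [{"role": "system", "content": "You are a helpful assistant."}]

def build_request(user_message, history):
    """Append the new question and return the full request body."""
    history.append({"role": "user", "content": user_message})
    return {"model": "gpt-3.5-turbo", "messages": history}

def record_reply(assistant_reply, history):
    """Store the model's answer so later calls can reference it."""
    history.append({"role": "assistant", "content": assistant_reply})

build_request("What is Bubble?", history)
record_reply("Bubble is a no-code app builder.", history)
body = build_request("Who makes it?", history)

# Every prior turn is resent, so the model can resolve "it":
print(len(body["messages"]))  # → 4 (system, user, assistant, user)
```

In Bubble, the equivalent is saving each prompt and response to the database (or a custom state) and using :format as text to rebuild the messages array for each new POST.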
Emulating @stevenrichardlevy's spirit of sharing, I would also like to share this comment from another AI thread in the forum about building expanded AI functionality into a Bubble app:
This topic was automatically closed after 70 days. New replies are no longer allowed.