I am feeling a little bit stuck with the API and need your support.
Currently I am working with the OpenAI API. Functionality: the user sends a modified prompt to OpenAI and receives a response, which is then stored in a database. The input is always the same prompt, but with different variables in it. The variables/values are set on the front end by the user. The prompt is then sent to the API, and a response should be generated.
Simple prompt example: Write me a text of [number_of_words] words and [number_of_paragraphs] paragraphs, etc.
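To make it concrete, after the user picks values on the front end, the message I actually want to send would look roughly like this (the values are just examples, and this is the standard message format the OpenAI endpoints expect):

```
{ "role": "user", "content": "Write me a text of 500 words and 3 paragraphs" }
```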
My approach:
I create an assistant and a thread.
Then I create a message with a POST request.
I assume that I need to send a POST request containing my prompt as a message, and then make a GET request to retrieve the generated response (rough sketch of the call sequence below).
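Roughly, the sequence I have in mind is the following (just a sketch; thread_id and assistant_id come from the earlier create-thread and create-assistant calls, and every call still needs the Authorization and OpenAI-Beta: assistants headers):

```
# 1. Add the user's prompt as a message on the thread
POST https://api.openai.com/v1/threads/{thread_id}/messages
{ "role": "user", "content": "Write me a text of [number_of_words] words and [number_of_paragraphs] paragraphs" }

# 2. Start a run so the assistant processes the thread
POST https://api.openai.com/v1/threads/{thread_id}/runs
{ "assistant_id": "asst_..." }

# 3. Poll the run until its status is "completed"
GET https://api.openai.com/v1/threads/{thread_id}/runs/{run_id}

# 4. Fetch the messages; the assistant's reply is the newest one in the list
GET https://api.openai.com/v1/threads/{thread_id}/messages
```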
Key question: how do I set/insert values into the prompt, and how (and where in the Bubble interface) do I pass them?
My first idea was to send my prompt and variables inside the JSON body, but I did not find any information about how to insert dynamic front-end values into the JSON body.
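What I imagine for the API Connector body is something like this, where the angle-bracket keys would become dynamic parameters filled in from the front end (I'm not sure this placeholder syntax is the right way to expose them, so please correct me if not):

```
{ "role": "user", "content": "Write me a text of <number_of_words> words and <number_of_paragraphs> paragraphs" }
```

If that works, each `<...>` key should then show up as a field on the API call in the workflow action, where the values from the input elements could be plugged in.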
Hi @adam30, following up on the topic and the brilliant video by @MattN, do you know of a way to populate a system prompt while still keeping it private? I was thinking a backend workflow, maybe?
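What I mean is roughly this: the backend workflow would make the call itself, so the system prompt stays server-side and only the user's variables come from the page (a sketch, assuming the standard Chat Completions format and an example model name):

```
{
  "model": "gpt-4",
  "messages": [
    { "role": "system", "content": "Instructions I don't want the user to see" },
    { "role": "user", "content": "Write me a text of <number_of_words> words and <number_of_paragraphs> paragraphs" }
  ]
}
```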
Thanks for the help.
Hey @MattN, thank you for your answer. I just tried it. It works, but not with a streaming response, since the format returned by the backend workflow is of type text rather than text stream. I’m going to try calling it from the API Connector and asking for “stream” again. Let’s see…
Yes, I have the type “stream” set both on my API Connector call and on the “return data from API” custom type in the action, but it returns nothing. I just wonder if I’m using the “return data from API” custom type field correctly…
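For context, the call itself is just the standard body with streaming turned on (nothing exotic, as far as I can tell; the model name is only an example):

```
{
  "model": "gpt-4",
  "stream": true,
  "messages": [
    { "role": "user", "content": "Write me a text of <number_of_words> words and <number_of_paragraphs> paragraphs" }
  ]
}
```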