Dynamic values/variables in OpenAI prompts

Hey Bubblers,

I am feeling a little stuck with the API. I need your support.

Currently I am working with the OpenAI API.
Functionality: the user sends a modified prompt to OpenAI and receives a response, which is stored in a database. The input is always the same prompt, but with different variables in it. The variables/values are set on the front end by the user, then sent to the API, which should generate a response.

Simple prompt example:
Write for me a text of [number_of_words] words and [number_of_paragraphs] paragraphs, etc.
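One common pattern (a minimal sketch, not Bubble-specific; the `fill_prompt` helper and variable names are hypothetical) is to keep the prompt as a template string and substitute the user's front-end values into it before sending it to the API:

```python
# Minimal sketch: fill bracketed placeholders in a prompt template
# with user-supplied values before sending the prompt to the API.
# The template mirrors the example prompt above.
PROMPT_TEMPLATE = (
    "Write for me a text of [number_of_words] words "
    "and [number_of_paragraphs] paragraphs."
)

def fill_prompt(template: str, values: dict) -> str:
    """Replace each [key] placeholder with its value from the front end."""
    for key, value in values.items():
        template = template.replace(f"[{key}]", str(value))
    return template

prompt = fill_prompt(
    PROMPT_TEMPLATE,
    {"number_of_words": 500, "number_of_paragraphs": 3},
)
```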

My approach:

  1. I created an assistant and a thread.
  2. Then I created a message with POST.

I assume that I need to send a POST request containing my prompt as a message, and then a GET request to receive the generated response.
The key questions: how do I set/insert values into the prompt, and how (and where in the Bubble interface) do I pass them?
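One note on that flow: with the Assistants API there is an extra step between the POST and the GET — after adding the message you also create a *run* and wait for it to complete before fetching the reply. A rough sketch of the request sequence (endpoints per OpenAI's Assistants API docs; the thread/assistant IDs are placeholders), written as payload-building helpers rather than live calls:

```python
# Sketch of the request sequence for the OpenAI Assistants API.
# Each helper returns (method, url, json_body); actually sending the
# requests and polling the run status are left out for brevity.
BASE = "https://api.openai.com/v1"

def add_message(thread_id: str, prompt: str):
    # Step 1: POST the user's (already filled-in) prompt to the thread.
    return ("POST", f"{BASE}/threads/{thread_id}/messages",
            {"role": "user", "content": prompt})

def create_run(thread_id: str, assistant_id: str):
    # Step 2: create a run so the assistant actually generates a reply.
    return ("POST", f"{BASE}/threads/{thread_id}/runs",
            {"assistant_id": assistant_id})

def list_messages(thread_id: str):
    # Step 3: once the run has completed, GET the thread's messages.
    return ("GET", f"{BASE}/threads/{thread_id}/messages", None)
```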

My first idea was to send my prompt and variables inside the JSON body. But I did not find any information about how to insert a dynamic front-end value into the JSON body.
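For reference, in Bubble's API Connector you declare a dynamic value by wrapping a name in angle brackets inside the JSON body; each `<name>` then appears as a parameter you can fill from the front end (once "private" is unticked, as discussed below). A hedged sketch of what the message part of the body might look like, reusing the placeholder names from the example prompt:

```json
{
  "role": "user",
  "content": "Write for me a text of <number_of_words> words and <number_of_paragraphs> paragraphs."
}
```

When passing longer dynamic text this way, it is usually worth formatting the value as JSON-safe on the Bubble side so quotes and line breaks don't break the body.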

Can you help me with that?

Hey mate,

So you need to uncheck the “private” checkbox, and then you should be able to access the variable from the front end.


Yes, you were right, thank you.

Hi @adam30, following up on the topic and the brilliant video by @MattN, do you know a way to populate a system prompt while still keeping it private? I was thinking a backend workflow, maybe?
Thanks for the help.

Yeah, you would need to call the LLM from within a backend workflow and pass in your system prompt there.

Basically, any API parameter is public if:

  • You store it in the API Connector setup and do NOT tick “private”
  • You populate the system prompt dynamically via the front end

Hey @MattN, thank you for your answer. Just tried it. It works, but not with a streaming response, since the format returned by the backend workflow is of type text, not text stream. I’m gonna try calling it from the API Connector and ask for “stream” again. Let’s see…

Have you tried changing the option to streaming on the API Connector?

Yes, I have the type “stream” set both on my API Connector and on the “return data from API” custom type in the action. But it returns nothing. I just wonder if I’m using the “return data from API” custom type field correctly…