How would I hide the prompts in an OpenAI api call?
When I right-click → Inspect, I can see my prompt. I don’t care about hiding what the user submits, as it isn’t sensitive.
I tried calling from the backend and it doesn’t work. When I inspect other bubble apps I’m unable to find their prompts so it leads me to believe this is possible. Just not sure how.
Also, is it possible to call OpenAI and return a response from the backend?
Move your request to a backend workflow, store the return into a dataset.
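To make the idea concrete outside of Bubble: the reason a backend workflow hides the prompt is that the browser only sends the user's text, while the server assembles the full request (hidden system prompt included) and forwards it to OpenAI. A minimal sketch of that pattern, with illustrative names (the endpoint URL and request body match OpenAI's Chat Completions API; everything else is an assumption):

```python
# Sketch of the "call from the backend" pattern: the hidden prompt lives
# on the server, so it never shows up in the browser's network tab.
import json
import urllib.request

# Kept server-side only; the client never sees this string.
HIDDEN_SYSTEM_PROMPT = "You are a helpful assistant."

def build_openai_payload(user_text: str) -> dict:
    """Assemble the Chat Completions request body on the server.
    The client submits only user_text; the system prompt is added here."""
    return {
        "model": "gpt-3.5-turbo",
        "messages": [
            {"role": "system", "content": HIDDEN_SYSTEM_PROMPT},
            {"role": "user", "content": user_text},
        ],
    }

def call_openai(payload: dict, api_key: str) -> dict:
    """Forward the assembled request to OpenAI from the backend."""
    req = urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

In Bubble terms, the backend workflow plays the role of `call_openai`: it holds the prompt and the API key, and the page only ever sees the saved result.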
Thanks, Doug. This is what I assumed at first and tried.
I’ll retry. Have you been able to store the return from OpenAI from a backend workflow?
I’m not sure why it wouldn’t work.
I do this with OpenAI. Are you having any specific errors that you can share so we can see how we can help?
Let us know!
Hi @doug.burden & @J805,
Thanks so much for the quick reply. I’m using the Whisper API to transcribe audio and then a GPT-3.5 call once the transcript is returned.
This is the current workflow (pretty messy, and I think I’m just messing it up more). Previously, I set the state of the page and added a pause in between the front-end workflows.
Could something be wrong with the backend workflow? It is transcribing the audio fine.
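For readers following along, the two-step pipeline described here (Whisper first, then GPT-3.5 on the returned transcript) can be sketched as plain control flow. The `transcribe` and `complete` callables stand in for the two API calls; the point is that step 2 must wait on step 1's result rather than being triggered on a timer or page-state pause:

```python
# Hedged sketch of chaining Whisper -> GPT-3.5. The two callables are
# stand-ins for the real API calls; names are illustrative.
def transcribe_then_prompt(audio_bytes, transcribe, complete):
    """Run the second (GPT) step only once the first (Whisper) step
    has actually returned a transcript."""
    transcript = transcribe(audio_bytes)  # Whisper step
    if not transcript:
        raise ValueError("empty transcript; nothing to send to GPT-3.5")
    # GPT step, fed the transcript from step 1
    return complete(f"Summarize this transcript:\n{transcript}")
```

In Bubble, the equivalent is having the GPT action use the Whisper action's result directly (or scheduling the second backend workflow from the first), instead of pausing the front end and hoping the transcript is ready.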
If that’s confusing I’ll re-explain.
Thank you soooo much!
I don’t see you passing the note to the backend. That might be the reason. How would the backend know where to save the result unless you pass the note in as a parameter?
Does that help?
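The fix being described, passing the record's id as a parameter when scheduling the backend job so it knows which row to write the response into, can be sketched like this (the in-memory "database" and names are illustrative, not Bubble's actual API):

```python
# Sketch: the front end schedules the backend job WITH the note's id,
# so the backend can save the OpenAI response to the right record.
notes = {"note-123": {"transcript": "hello world", "gpt_response": None}}

def backend_workflow(note_id: str, gpt_response: str) -> None:
    """Receives the id as a parameter and saves the result to that record.
    Without the id, there is no way to know which note to update."""
    if note_id not in notes:
        raise KeyError(f"unknown note id: {note_id}")
    notes[note_id]["gpt_response"] = gpt_response
```

In Bubble this corresponds to adding the note as a parameter on the backend workflow and supplying it in the "Schedule API Workflow" action.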
Thank you so much! Time for a break… Can’t believe it was that obvious!
Sorry everyone and thank you for putting up with me.
No worries. Happy to help.