Hi All,
I have configured an API call to OpenAI using the Bubble API Connector and it works, but with a large prompt the response comes back with the header "transfer-encoding": "chunked", even though I have set "stream": false in the request body. I have tried trimming down the prompt, but I can't get it small enough.
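For reference, this is roughly what my API Connector JSON body looks like (the model name and parameter values are just placeholders, and the prompt text is dynamic in the actual call):

```json
{
  "model": "gpt-4",
  "messages": [
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "user", "content": "<large prompt goes here>" }
  ],
  "stream": false,
  "temperature": 0.7
}
```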
Is there a way to get OpenAI to return the full response across multiple calls so I can collect the entire package of chunks? If so, how? If not, do I have to develop a custom plugin to handle the streaming?
Thanks,
KH