OpenAI AND "transfer-encoding": "chunked"

Hi All,

I have configured an API call to OpenAI using the Bubble API Connector and it works, but with a large prompt the response comes back with "transfer-encoding": "chunked", even though I have set "stream": false. I have tried trimming down the prompt but I can't get it small enough.

Is there a way to get OpenAI to let you make multiple calls to retrieve the entire set of chunks? If yes, how? If not, do I have to develop a custom plugin to handle streaming?

Thanks,

KH

The models have a limited context window and there is no way to exceed it. Sometimes I put the long prompt in a text file and ask the model to read and answer it!