OpenAI error with Recursive WF: Could not connect to remote server

Hello comrades

Has anyone encountered this error?

I have an API workflow that calls OpenAI. It’s a recursive workflow, and it runs fine for a while until it stops with this error message. I don’t actually know if it comes from Bubble or from OpenAI.

The prompts can get quite big, and I only space these workflows 1 second apart. I’m guessing I’m sending too much information to OpenAI too fast.

Is there at least a way to catch this error, so that I can build a retry system?

I’m thinking of moving this entire workflow out of Bubble if I cannot make it work efficiently. If anyone has done anything similar, please let me know :slight_smile:
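If the workflow does move out of Bubble, the error itself becomes catchable in code. Here is a minimal sketch, assuming the official openai Python package (v1.x); the model name, retry count, and the `call_model` helper are placeholders, not anything from the original workflow:

```python
import time
from openai import OpenAI, APIConnectionError, RateLimitError

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def call_model(messages, max_retries=5):
    """Call the chat completions endpoint, retrying on connection and rate-limit errors."""
    for attempt in range(max_retries):
        try:
            response = client.chat.completions.create(
                model="gpt-4o-mini",  # placeholder model name
                messages=messages,
            )
            return response.choices[0].message.content
        except (APIConnectionError, RateLimitError) as err:
            # Exponential backoff: 1s, 2s, 4s, 8s, ...
            wait = 2 ** attempt
            print(f"Attempt {attempt + 1} failed ({type(err).__name__}); retrying in {wait}s")
            time.sleep(wait)
    raise RuntimeError("OpenAI call failed after all retries")
```

The point is simply that "Could not connect to remote server" and 429 rate-limit responses surface as distinct exception types there, so a retry loop can handle them explicitly instead of the whole recursive workflow dying.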

@detrazvictor check your rate limits with OpenAI and how many tokens you’re likely to be using in a minute.

E.g. if your rate limit is 20K tokens per minute and you use 5K tokens in 10 seconds, you’re on pace for 30K per minute, so you might still get rate limited before the full minute is up.

Solution: increase your rate limit or space out your workflows.
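One rough way to check this is to count the tokens in each prompt before sending it. A minimal sketch, assuming the tiktoken package; the per-minute budget and call rate below are illustrative values, not your actual numbers:

```python
import tiktoken

# cl100k_base is the encoding used by the gpt-3.5 / gpt-4 family of models
encoding = tiktoken.get_encoding("cl100k_base")

TPM_LIMIT = 20_000          # illustrative tokens-per-minute limit
calls_per_minute = 60       # one recursive workflow call per second

prompt = "...your prompt text here..."
prompt_tokens = len(encoding.encode(prompt))

# Rough throughput estimate (ignores completion tokens, which also count toward the limit)
estimated_tpm = prompt_tokens * calls_per_minute
print(f"{prompt_tokens} tokens per prompt -> ~{estimated_tpm} tokens/minute")

if estimated_tpm > TPM_LIMIT:
    print("Likely to hit the rate limit; space out the calls or trim the prompt")
```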

@detrazvictor

https://platform.openai.com/docs/guides/rate-limits?context=tier-free

Hey @DjackLowCode @cmarchan

Thanks for the pointers!
I’ll indeed have to calculate how many tokens I’m sending.

What’s really disappointing is that there’s apparently no way to catch this error. It just occurs at the workflow level, and the OpenAI step never returns an error (I monitor the “returned_an_error” info for every workflow run).

I managed to build a retry system yesterday by checking whether the index is still the same after about 45 seconds. If it is, and the workflow is not “done” (a flag I save in the DB when the index reaches the last index, i.e. the “:count” of the list I apply the WF to), then I re-schedule it with the necessary data.

And it works for my current purpose. But I’ll see whether increasing the token limit solves this.
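For anyone reproducing this watchdog pattern outside Bubble, the logic looks roughly like the sketch below. It is only an illustration: `read_progress` and `reschedule_workflow` are hypothetical stand-ins for the DB reads and the re-scheduled workflow described above.

```python
import time

CHECK_INTERVAL = 45  # seconds, matching the delay used above

def read_progress():
    """Hypothetical DB read: returns (current_index, total_count, done_flag)."""
    ...

def reschedule_workflow(index):
    """Hypothetical re-scheduling of the recursive workflow from the given index."""
    ...

def watchdog():
    last_index, _, _ = read_progress()
    while True:
        time.sleep(CHECK_INTERVAL)
        index, total, done = read_progress()
        if done or index >= total:
            break  # the recursion finished normally
        if index == last_index:
            # No progress within CHECK_INTERVAL: the workflow likely died, so retry it
            reschedule_workflow(index)
        last_index = index
```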

Many thanks either way!


@detrazvictor yup, I had the same problem! Let us know if you find a way to catch the error!
