Any solutions to this problem would be greatly appreciated. You could possibly run the call outside of Bubble and post it via the connector. But if many people do this, the rate limiting will become an issue again. It's an endless cycle unless the rate limit is raised or the calls are made from different servers/hosts.
Hello there, same problem for me! I hope Bubble.io can fix this soon. My website just went live on Friday and it uses the DALL-E 3 API, so there goes my successful launch.
Same issue here, would greatly appreciate a solution. I'm currently having to send my calls through a separate service before they reach OpenAI (DALL-E 3), which slows down the user experience and adds extra cost.
Yeah, Bubble needs to get in touch with OpenAI, but I doubt that'll happen at a speed that's useful. This is an OpenAI limit, not a Bubble one.
You can either make the API calls client side (which is fine if the API key is provided by the user, but not if your site uses one shared API key), or set up a request forwarder.
That could be a simple Cloud Function/Lambda function that forwards any request you send it to the correct place and returns the result.
ChatGPT/Claude 3 Opus will write the above for you fairly easily.
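For what it's worth, here's a rough sketch of what such a forwarder might look like as an AWS-Lambda-style Python handler. This is an illustration, not a drop-in solution: the API-Gateway-style event shape (`path`, `body` keys), the upstream host constant, and the `OPENAI_API_KEY` environment variable name are all assumptions you'd adapt to your own setup.

```python
# Minimal request-forwarder sketch (assumed Lambda/API Gateway event shape).
# The point is that the API key stays server-side and the request originates
# from your own host, not Bubble's shared servers.
import os
import urllib.request

UPSTREAM = "https://api.openai.com"  # assumed upstream host


def build_forward_request(event):
    """Translate an incoming event into (url, headers, body) for upstream."""
    url = UPSTREAM + event.get("path", "/")
    headers = {
        "Content-Type": "application/json",
        # Key is read from the environment instead of being exposed client side.
        "Authorization": "Bearer " + os.environ.get("OPENAI_API_KEY", ""),
    }
    body = (event.get("body") or "").encode()
    return url, headers, body


def handler(event, context=None):
    """Forward the request to OpenAI and return the response verbatim."""
    url, headers, body = build_forward_request(event)
    req = urllib.request.Request(url, data=body, headers=headers, method="POST")
    with urllib.request.urlopen(req) as resp:
        return {"statusCode": resp.status, "body": resp.read().decode()}
```

From Bubble you'd then point the API Connector at the forwarder's URL instead of OpenAI's, so the rate limit is counted against your function's host rather than Bubble's shared IPs.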
Maybe the flaw is that they rate-limit a single server even when that server makes calls with different API keys. Different API keys should mean some flexibility in who can call what and how often. Technically, they're all legit keys.
So, theoretically, they could hit a rate limit on ChatGPT too, right? Is EVERYTHING on one server?
Imagine the screaming and chaos if that happens. And of course, it would be addressed a lot sooner too. But if there's potential for ChatGPT itself to be rate-limited, perhaps the fear of what could happen might prompt some action.