ChatGPT API call adding random words/characters before output

Hi everyone,

Like most people in the community, I’ve loved how easy it is to integrate ChatGPT into Bubble. However, recently I’ve been getting a few random words or characters before my desired output and was wondering if anyone else is having the same problem.

My API call is fairly simple, using the JSON below:

{
  "model": "text-davinci-003",
  "prompt": "Write a small blog post based on the descriptions listed below\n\nThe best places to travel in New Zealand\nCommon slang words used in New Zealand\nCurrency used in New Zealand",
  "temperature": 0.7,
  "max_tokens": 256,
  "top_p": 1,
  "frequency_penalty": 1.5,
  "presence_penalty": 0
}

When I test my prompt in the OpenAI playground it works perfectly every time, but when I make the API call in Bubble it periodically throws random words and characters at the start of the output before serving up the correct content.
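One thing worth checking before digging deeper: the completions endpoint often returns text that begins with newlines or other leading whitespace, which can look like garbage when rendered in a Bubble element. This won’t explain stray whole words, but it’s a cheap thing to rule out. A minimal post-processing sketch (the `raw` string is a hypothetical example of the symptom, not an actual API response):

```python
# Hypothetical raw completion text illustrating the symptom:
# the completions API frequently prefixes output with "\n\n".
raw = "\n\nThe best places to travel in New Zealand are..."

# Strip leading whitespace before displaying the text in Bubble
# (e.g. in a workflow step or via the :trimmed operator).
clean = raw.lstrip()

print(clean)
```

If the stray characters are always whitespace, trimming like this in your Bubble workflow fixes the display without touching the API call.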

Has anyone else come across the same problem? Would love to know how you solved it.

Cheers!

Hi Frazer, is it possible that in the OpenAI playground you are using a lower temperature setting?

It might be that the setting 0.7 is too high in your case (although it is the recommended setting for creative tasks/texts like blog posts). For more info, check this article.


Thanks @gerbertdelangen – yeah, I played around with the temperature and it still came back with the random words sporadically. I ended up copying the prompt again from ChatGPT and so far it looks to be working fine. I’m assuming there was some formatting in my original prompt which was making it output oddly. Thanks for the help!
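For anyone hitting the same thing: the fact that re-copying the prompt fixed it suggests hidden characters (smart quotes, non-breaking spaces, zero-width characters) had crept in from a rich-text editor. A quick way to check or guard against this is to normalize the prompt string before sending it. This is only a sketch; `sanitize_prompt` is a hypothetical helper, not part of any library:

```python
import unicodedata

def sanitize_prompt(text: str) -> str:
    """Replace characters that often sneak in when copying text from
    rich-text editors with plain ASCII equivalents."""
    replacements = {
        "\u2018": "'", "\u2019": "'",   # curly single quotes
        "\u201c": '"', "\u201d": '"',   # curly double quotes
        "\u00a0": " ",                  # non-breaking space
        "\u200b": "",                   # zero-width space
    }
    for bad, good in replacements.items():
        text = text.replace(bad, good)
    # Fold any remaining compatibility characters to their plain forms.
    return unicodedata.normalize("NFKC", text)

print(sanitize_prompt("Write a \u201csmall\u201d blog\u00a0post"))
```

Running the prompt through something like this (or just re-typing it in a plain-text editor, as Frazer did) removes the invisible formatting that can make a completion model behave differently from the playground.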


This topic was automatically closed after 70 days. New replies are no longer allowed.