OpenAI - We could not parse the JSON body of your request

Hi, I’m trying to build an API call to OpenAI, but sometimes I get this error:

There was an issue setting up your call.
Raw response from the API:
Status code 400
{
  "error": {
    "message": "We could not parse the JSON body of your request. (HINT: This likely means you aren't using your HTTP library correctly. The OpenAI API expects a JSON payload, but what was sent was not valid JSON. If you have trouble figuring out how to fix this, please send an email to support@openai.com and include any relevant code you'd like help with.)",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}

My API call is the following:

{
  "model": "text-davinci-003",
  "prompt": "Q: can you explain to me this excel \n =INDEX(A1:D10, MATCH("maximum", A1:D10, 0), MATCH("January", A1:D1, 0)) \n",
  "temperature": 0.22,
  "max_tokens": 500,
  "top_p": 1,
  "frequency_penalty": 0,
  "presence_penalty": 0
}

The Excel formula is the dynamic value. Note that if I run the call with a different dynamic value:

{
  "model": "text-davinci-003",
  "prompt": "Q: can you explain to me this excel \n =SUM(A1:A2) \n",
  "temperature": 0.22,
  "max_tokens": 500,
  "top_p": 1,
  "frequency_penalty": 0,
  "presence_penalty": 0
}

then the call works.

Any tips?

I suspect that the quotes around "maximum" are causing the problem; however, I would rather not get rid of them. Is there any way I can set the dynamic value as text so it does not get mixed up with the JSON?
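For anyone wondering why this breaks: the double quotes (and the raw newlines) inside the dynamic value terminate the JSON string early. A quick sketch in Python, using `json.dumps` as a stand-in for whatever escaping your tool provides:

```python
import json

# A dynamic value that itself contains double quotes.
formula = '=INDEX(A1:D10, MATCH("maximum", A1:D10, 0), MATCH("January", A1:D1, 0))'
prompt = "Q: can you explain to me this excel \n %s \n" % formula

# Naive interpolation: the inner quotes (and the raw newlines)
# terminate the JSON string early, so the body is invalid.
broken = '{"prompt": "%s"}' % prompt
try:
    json.loads(broken)
except json.JSONDecodeError as e:
    print("broken body:", e.msg)

# json.dumps escapes quotes, backslashes and newlines, and adds the
# surrounding double quotes, producing a valid JSON string value.
safe = '{"prompt": %s}' % json.dumps(prompt)
print(json.loads(safe)["prompt"] == prompt)  # prints True
```

The `=SUM(A1:A2)` formula works because it happens to contain no quotes, backslashes, or other characters that need escaping.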

When you’re doing the API call action and filling in the parameter values, add :formatted as JSON safe after your input’s value

+1 to this.

I’ve noticed it only happens when I submit a dynamic value, I tried the ‘:formatted as JSON safe’, no joy :no_mouth:

Also, it all worked perfectly fine yesterday, only an issue today :man_shrugging:

The way I solved it was to automatically change the input from "maximum" to \"maximum\", i.e. escaping the quotes with backslashes.

Nice. For whoever it might help, my issue was slightly different: I was parsing user-generated text, and sometimes that text contained special characters, triggering the error. I simply added a ‘find & replace’ step to remove those when they appear, and now it’s working.

The issue here could be a character that was not escaped. I have noticed that "format as JSON safe" only works if the request field is actually unformatted.

Plugin builders often pre-define the double quotes to make it easier for the user to send text, thus not requiring them to properly format the data as a string.

Try searching for and escaping all non-JSON-safe characters, or create your own API call that does not assume the data is text.
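As a sketch of the "search for and escape" idea, here is a minimal Python helper (the `json_escape` name is my own, not part of any library; in practice `json.dumps` already does this and more):

```python
import json

def json_escape(value: str) -> str:
    """Escape the characters that most commonly break a JSON string value.
    Backslashes must be handled first, or they get double-escaped."""
    return (value.replace("\\", "\\\\")
                 .replace('"', '\\"')
                 .replace("\n", "\\n")
                 .replace("\r", "\\r")
                 .replace("\t", "\\t"))

user_text = 'explain =INDEX(A1:D10, MATCH("maximum", A1:D10, 0))'
body = '{"prompt": "%s"}' % json_escape(user_text)
print(json.loads(body)["prompt"] == user_text)  # prints True
```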

Hey Tyler,

Where would you add this?

‘:formatted as JSON safe’

Inside the body parameters or a separate parameter?

For example, this is my api connection.

It works perfectly when there aren’t many characters in the main input, like a 5-minute Whisper transcription, but when I had someone use a 45-minute episode that contained a lot of characters, it broke.

Where should I insert :formatted as JSON safe?

Thanks!

Hi @jakepearsoncreator

The :formatted as JSON safe operator is used to escape any characters that would break JSON formatting (quotes, backslashes, newlines, etc.), and it also adds double quotes on each side. Any characters like those would otherwise break your API call.
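To illustrate what that operator does, Python's `json.dumps` behaves the same way on a bare string: it escapes the dangerous characters and adds the surrounding double quotes itself, so you should not wrap the dynamic value in quotes of your own:

```python
import json

text = 'line one\nhe said "hi" and used a \\ backslash'

# Like the JSON-safe operator: escapes quotes, backslashes and
# newlines AND wraps the result in double quotes.
print(json.dumps(text))
# prints: "line one\nhe said \"hi\" and used a \\ backslash"

# Round-trips back to the original text.
assert json.loads(json.dumps(text)) == text
```

So in a request body template, the already-quoted result replaces the whole value, quotes included.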

Sorry, I’m not quite following.

I think it might have to do with not having access to gpt 4.

Yeah, see what your error is. You should probably have the JSON-safe formatting anyway, because there’s a good chance a prompt will have line breaks and quotes and stuff…

I’m not really up to scratch.

Do you have a link to something valid I could read, please?

Link to what :laughing:

I was just saying to check what error OpenAI is returning to you if you’re having issues. If it says something about parameters missing, then it’s most likely the prompt breaking the formatting so their API can’t process the request.

Hi Jake have you figured it out yet? I’m experiencing the exact same problem

Thanks in advance

Nope. It comes down to not having access to the GPT-4 API and breaking the token limit.

I did ask someone else, and they said an OpenAI vector embedding using Pinecone could work.

No idea TF that means yet

FWIW, I got the same error when accidentally omitting a " at the end of "max_tokens:
{
  "model": "gpt-4-1106-preview",
  "messages": [
    {
      "role": "user",
      "content": ""
    }
  ],
  "temperature": <temperature>,
  "max_tokens: <max_tokens>
}

This was the fix:
{
  "model": "gpt-4-1106-preview",
  "messages": [
    {
      "role": "user",
      "content": ""
    }
  ],
  "temperature": <temperature>,
  "max_tokens": <max_tokens>
}
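One way to catch this kind of template mistake before sending the request is to run the assembled body through a JSON parser; the error message points at the exact spot. A sketch in Python, with placeholder values filled in for the dynamic inputs:

```python
import json

# Body with the same mistake: missing closing quote on "max_tokens".
body = '''{
    "model": "gpt-4-1106-preview",
    "messages": [{"role": "user", "content": "hello"}],
    "temperature": 0.22,
    "max_tokens: 500
}'''

try:
    json.loads(body)
    print("body is valid JSON")
except json.JSONDecodeError as e:
    # lineno/colno point straight at the unterminated string.
    print("invalid JSON at line %d, column %d: %s" % (e.lineno, e.colno, e.msg))
```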

I was following these instructions: https://www.youtube.com/watch?v=7DYwv4mpWYs