API Connector: OpenAI: "message": "you must provide a model parameter"

Hello! I am using the API Connector plugin, but I keep getting an error saying that I don't have the model defined.

Error Returned…
There was an issue setting up your call.

Raw response for the API
Status code 400
{
    "error": {
        "message": "you must provide a model parameter",
        "type": "invalid_request_error",
        "param": null,
        "code": null
    }
}

Headers
Content-Type: application/json
Authorization type : Bearer ######

Body
{
“model”:“gpt-3.5-turbo”,
“messages”: [
{
“role”: “user”,
“content”:
}
],
“temperature”: ,
“max_tokens”: <max_tokens>,

}

Any ideas where I have a mistake?
Thanks!
John

Define a model then?

I thought I did here…
“model”:“gpt-3.5-turbo”,

It looks like you’re using curly quotes ” ” instead of straight quotes " ", which makes the JSON invalid.

Replace all ” with "
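
For reference, with straight quotes the body should look roughly like this, where <prompt>, <temperature> and <max_tokens> are the dynamic parameters you still need to fill in before initializing (note the prompt value also needs its own surrounding quotes, since it’s a string):

{
	"model": "gpt-3.5-turbo",
	"messages": [
		{
			"role": "user",
			"content": "<prompt>"
		}
	],
	"temperature": <temperature>,
	"max_tokens": <max_tokens>
}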

Thank you for replying with that tip. I did replace the curly quotes with straight quotes, but when I go to initialize the call I get the same error. I have also pasted a screenshot of what I have entered.

I am following along with a YouTube video, doing exactly the same thing (at least I think so :slight_smile:).

{
“model”:“gpt-3.5-turbo”,
“messages”: [
{
“role”: “user”,
“content”:
}
],
“temperature”: ,
“max_tokens”: <max_tokens>,

}

Did you specify the dynamic parameters before you initialised? E.g. prompt could be “Test prompt” and max tokens could be 2048.
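
For example, with the placeholders filled in for initialization, the body should end up looking something like this (the temperature of 1 is just an example value):

{
	"model": "gpt-3.5-turbo",
	"messages": [
		{
			"role": "user",
			"content": "Test prompt"
		}
	],
	"temperature": 1,
	"max_tokens": 2048
}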

good thinking, thanks again…so tried it … but still no luck…

error still appears…

Screenshot 2023-09-12 at 8.40.01 AM

Can you paste the raw JSON body in a code block here because the forum formats it funny which makes it hard to debug :slight_smile:

{
	"model": "gpt-3.5-turbo",
	"messages": [
		{
			"role": "user",
			"content": <prompt>
		}
	],
			"temperature": <temperature>,
			"max_tokens": <max_tokens>,
                
}

Haha I’ve just noticed

You have a comma after the last item in the list of keys. Remove the comma after <max_tokens> and you should be good to go.

I appreciate your help :+1: but… tried it again and still get the error. :tired_face:
Looking at this backwards and forwards.

the new code…

{
	"model": "gpt-3.5-turbo",
	"messages": [
		{
			"role": "user",
			"content": <prompt>
		}
	],
			"temperature": <temperature>,
			"max_tokens": <max_tokens>
                
}

the error again…


There was an issue setting up your call.

Raw response for the API 
Status code 400
{
    "error": {
        "message": "you must provide a model parameter",
        "type": "invalid_request_error",
        "param": null,
        "code": null
    }
}
{
	"model": "gpt-3.5-turbo",
	"messages": [
		{
			"role": "user",
			"content": "tell me a joke"
		}
	],
	"temperature": 1,
	"max_tokens": 1000
}

This works fine for me. Try copying and pasting that and initialising it without any dynamic parameters.

If that fails, are you sure your OpenAI account has access to the GPT-3.5-turbo model?

You can see this in the Usage area of the OpenAI Playground.


Can you show what you are sending to the API? In the first and second screenshots, your JSON is not correct.
Also, be sure your Content-Type header is correctly formatted. For content, are you adding " " around the value, or are you using the :formatted as JSON-safe operator? (Either way, in the API Connector you need to be sure the value is correctly encoded and has " " around it.)
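
In other words, the content line needs to end up as valid JSON either way. Roughly, either

"content": "<prompt>"

with the double quotes added around the dynamic value yourself, or

"content": <prompt>

where the inserted value is already quoted and escaped (which, if I remember right, is what the :formatted as JSON-safe operator does for you).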

Oh wow, I just figured it out. My bad… I must apologize, since I don’t know where this originated (me or copy/paste?).

…at any rate, the problem was that I had http instead of https in…

ugh!!

Thank you for your help, I actually really learned a lot and appreciate this community!

https://api.openai.com/v1/chat/completions

now it works!

Easy to miss the forest for the trees hahaha


Yes, crazy… I feel like I looked at every pixel! lol