Can't Get GPT Model to do prompt completion

Hi there,

I’m currently trying to get a GPT model to do a prompt completion, but regardless of what I try I get “[404] This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?”

I’m trying to run gpt-3.5-turbo-0125, which is the only version visible in the dropdown that isn’t labeled “instruct.”

I’ve verified that everything is correctly set up on the OpenAI side. I’m a bit pressed for answers on this.

Any insight is appreciated.

Hey Nick,

Is it possible you set it to create a chat completion first, and then switched to prompt completion?

I’ve noticed on my end that if you configure the module to use chat completion first, save by clicking OK, then re-open the module’s settings and switch to prompt completion, it lets you save again with the same model, even though that model isn’t valid for that endpoint:
See my Loom video

I think you should be able to fix this by reselecting your model and setting it to gpt-3.5-turbo-instruct in the dropdown.
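For anyone curious why the 404 happens at all: chat models such as gpt-3.5-turbo-0125 only accept the v1/chat/completions endpoint, while the legacy v1/completions (prompt completion) endpoint only works with “instruct”-style models. Here’s a minimal Python sketch of that routing logic (the name-based check is a heuristic for illustration, not an official OpenAI rule):

```python
# Heuristic sketch: pick the OpenAI endpoint a model name supports.
# Chat models (e.g. gpt-3.5-turbo-0125) require /v1/chat/completions;
# "instruct" models still accept the legacy /v1/completions endpoint.
def completions_endpoint(model: str) -> str:
    if "instruct" in model:
        return "/v1/completions"       # legacy prompt completion
    return "/v1/chat/completions"      # chat completion

def build_payload(model: str, text: str) -> dict:
    """Build a request body matching the endpoint the model supports."""
    if completions_endpoint(model) == "/v1/completions":
        # Legacy endpoint takes a bare prompt string.
        return {"model": model, "prompt": text}
    # Chat endpoint expects a messages array instead of a prompt.
    return {"model": model, "messages": [{"role": "user", "content": text}]}
```

So sending gpt-3.5-turbo-0125 to the prompt-completion endpoint is exactly the mismatch the error message describes; the module just shouldn’t let you save that combination.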

Hope that helps!

Thomas - Nola Digital


I attempted that, but unfortunately I got the same issue. I’m only selecting from the available options in the dropdown menu; I’ve tried all of them and they all return the same 404.