OpenAI max_tokens value doesn't seem to constrain output

I read this thread: Can't make chatgpt produce a text with limited characters number

That thread said that by setting the max_tokens value in the OpenAI module, I could constrain the completion size.

However, when I set it to 300, I still often get completions of over 1,200 tokens, sometimes more. It's as if the setting is being ignored.

Is there a way to verify whether the max_tokens value is actually being applied by Make?
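
One way to sanity-check this outside of Make is to call the OpenAI API directly with the same max_tokens value and inspect the token usage the API reports back. Here's a minimal sketch in Python, assuming the official openai library, an OPENAI_API_KEY environment variable, and gpt-3.5-turbo as a placeholder model (use whichever model your Make module is configured with):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder; match the model used in the Make module
    messages=[{"role": "user", "content": "Write a long essay about automation."}],
    max_tokens=300,  # same limit set in the Make OpenAI module
)

# completion_tokens should never exceed max_tokens;
# finish_reason == "length" means the output was truncated at the limit
print("completion_tokens:", response.usage.completion_tokens)
print("finish_reason:", response.choices[0].finish_reason)
```

If the completion_tokens here stay at or below 300 but the Make scenario still reports 1,200+, that would point to the module not passing the value through rather than the API ignoring it.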

For quicker assistance with bugs and technical problems like this, you may want to contact support directly. They respond very quickly and update you frequently on the status of their investigation.

Hope you can share the resolution with us if you manage to solve this problem!
