OpenAI Module Error 429: Request too large on tokens per minute (TPM)

Hi, I am having a similar problem to Arruga.

Here is my error:

“The operation failed with an error. [429] Request too large for gpt-3.5-turbo in organization org-4nc2qvbIzR6DZxrp7vyvB6l7 on tokens per min (TPM): Limit 60000, Requested 133210. The input or output tokens must be reduced in order to run successfully.”

Also, the screenshot https://community.make.com/t/openai-module-error-429/30194/4?u=jordan7 shows my credit, my spending limit, and my prompt.

You are requesting 133,210 tokens, but your organisation's limit for gpt-3.5-turbo is 60,000 tokens per minute (TPM).

Try using the gpt-4-turbo-preview model, which accepts a much larger maximum prompt size.

You will also need to increase the max tokens field to 300000.
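If you want to verify the numbers yourself, you can count the tokens outside Make with OpenAI's tiktoken library. This is only a minimal sketch, assuming you can paste the prompt and the file text into a local script; the placeholder strings are yours to fill in:

```python
# Estimate how many tokens a request will use before sending it,
# so it can be compared against the 60,000 TPM limit from the error.
# Requires: pip install tiktoken
import tiktoken

TPM_LIMIT = 60_000  # per-minute limit for gpt-3.5-turbo in this organisation

def count_tokens(text: str, model: str = "gpt-3.5-turbo") -> int:
    encoding = tiktoken.encoding_for_model(model)
    return len(encoding.encode(text))

prompt = "..."         # paste the prompt from your Make module here
file_contents = "..."  # paste the mapped file/Data content here

total = count_tokens(prompt) + count_tokens(file_contents)
print(f"Estimated request size: {total} tokens (limit: {TPM_LIMIT} TPM)")
```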

OK, I understand that… Another question: how did I use 133,210 tokens?

  1. My prompt is short, as you can see in the screenshot.
  2. The file that ChatGPT will be accessing is only 315 tokens (I counted).

What other reason could there be for me using this many tokens?

Some possible reasons:

  • Maybe the module or scenario is called too many times in the same minute (a repeater or lots of bundles)?
  • Perhaps you have other OpenAI modules using the same model in other scenarios?
  • Maybe the content of the Data field, when converted to tokens, uses far more tokens than you expect? (One way to work around this is sketched below.)
  • Maybe, because of spelling mistakes, OpenAI doesn't recognise the words and so splits them into many more tokens?
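If the Data content does turn out to be the heavy part, one generic workaround is to split the text into chunks that each stay under the per-minute limit and send them a minute apart. This is a sketch in Python with the openai and tiktoken libraries rather than a Make-specific recipe; the chunk size, the summarisation prompt, and the 60-second pause are illustrative assumptions:

```python
# Split a large text into token-sized chunks and send them one per minute,
# so no single request (or single minute) exceeds the TPM limit.
# Requires: pip install openai tiktoken
import time

import tiktoken
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
encoding = tiktoken.encoding_for_model("gpt-3.5-turbo")

CHUNK_TOKENS = 50_000  # leaves headroom below the 60,000 TPM limit

def chunk_text(text: str, max_tokens: int = CHUNK_TOKENS):
    """Yield pieces of `text` that are at most `max_tokens` tokens long."""
    tokens = encoding.encode(text)
    for start in range(0, len(tokens), max_tokens):
        yield encoding.decode(tokens[start:start + max_tokens])

def process_in_chunks(text: str) -> list[str]:
    replies = []
    for chunk in chunk_text(text):
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": f"Summarise this:\n\n{chunk}"}],
        )
        replies.append(response.choices[0].message.content)
        time.sleep(60)  # wait out the per-minute window before the next chunk
    return replies
```

Inside Make itself, the equivalent idea would be to break the input into smaller bundles and add a Sleep module between OpenAI calls so the requests are spread across minutes.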

Got it! Thanks for your time!

No problem, glad I could help!

1. If you have a new question in the future, please start a new thread. This makes it easier for others with the same problem to search for the answers to specific questions, and you are more likely to receive help since newer questions are monitored closely.

2. The Make Community guidelines encourage users to mark helpful replies as solutions to help keep the Community organized.

This marks the topic as solved, so that:

  • others can save time when catching up with the latest activity here, and
  • others can quickly jump to the solution if they come across the same problem.

To do this, simply click the checkbox at the bottom of the post that answers your question:
Screenshot_2023-10-04_161049

3. Don’t forget to like and bookmark this topic so you can get back to it easily in future!
