Issue with OpenAI module: maximum context length is too low

Hello everyone,

I’m having an issue between a Gmail module, which retrieves one email at a time, and the OpenAI module I use to analyse that email. If the email is very long, which can happen when it’s written in full HTML, I get this error message:

[400] This model’s maximum context length is 128000 tokens. However, your messages resulted in 262294 tokens. Please reduce the length of the messages.

Origin: OpenAI (ChatGPT, Whisper, DALL-E)

What can I do to:

  • allow more tokens?
  • extract the text from the HTML of the email and keep only the text? (see the sketch at the end of this post)
  • summarize the email before sending it to OpenAI?

If any of you have an idea or have already run into this problem, it would be great if you could share it.

Thanks a lot!
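
To illustrate the second and third points, here is a rough Python sketch of what I have in mind (outside of Make, just to show the idea): strip the HTML email down to plain text, then cut it to a token budget before it goes to OpenAI. The `beautifulsoup4` and `tiktoken` libraries, the `cl100k_base` encoding, and the 100,000-token budget are only assumptions/placeholders, not something my scenario currently does.

```python
# Sketch only (not a Make scenario): keep just the text of an HTML email
# and truncate it to a token budget before the OpenAI call.
# Assumes: pip install beautifulsoup4 tiktoken
from bs4 import BeautifulSoup
import tiktoken


def html_to_text(html: str) -> str:
    """Keep only the visible text of an HTML email."""
    soup = BeautifulSoup(html, "html.parser")
    # Drop style/script blocks, which can be huge in marketing emails.
    for tag in soup(["script", "style"]):
        tag.decompose()
    return soup.get_text(separator=" ", strip=True)


def truncate_to_tokens(text: str, max_tokens: int = 100_000) -> str:
    """Cut the text so it stays under the model's context limit.

    cl100k_base is an assumption; pick the encoding matching your model.
    """
    enc = tiktoken.get_encoding("cl100k_base")
    tokens = enc.encode(text)
    return enc.decode(tokens[:max_tokens])


# Example: prepare an email body before sending it to OpenAI.
email_html = "<html><body><p>Hello, <b>this</b> is the email…</p></body></html>"
clean_text = truncate_to_tokens(html_to_text(email_html))
print(clean_text)
```

If something like this can be reproduced with the built-in Make tools (a text parser to strip the HTML, then a length check before the OpenAI module), that would already solve most of my problem.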