Does the ChatGPT module use the same thread in a scenario or a new one for each module?

I’m trying to cut down on the operations used in a scenario.

Currently I have a long chain of ChatGPT modules: the first few add messages to get the prompting right, and the last one produces the output I actually want (anywhere from a single sentence to multiple paragraphs).

I have two questions.

  1. Does the chain use the same thread in ChatGPT (meaning, can it recall previous inputs), or do I need to repeat the prompts in each ChatGPT module to get the desired output? I hope that made sense.

  2. Can I somehow select multiple outputs from the same ChatGPT module, or does it always grab the last message output as the data to use? Even though the module has 3 messages, only the last one is available to map into Airtable.

Firstly, there is no ChatGPT in Make. I assume you meant the OpenAI “Create a chat completion” module.

The “Create a chat completion” module does not reuse any threads or store previous messages. You need to handle that yourself, using a database of your choice.
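The same statelessness applies when calling the Chat Completions API directly: the model only sees the messages included in each request, so "memory" means resending the full conversation every time. A minimal sketch of that pattern (the `call_model` stub stands in for the real `client.chat.completions.create(...)` call):

```python
# Each chat-completion call is stateless: the model only sees the messages
# sent in that one request. To simulate memory, keep the full history
# yourself and resend it on every call.

def call_model(messages):
    # Stand-in for client.chat.completions.create(model=..., messages=messages).
    # Here we just report how many messages the model would have seen.
    return f"(model saw {len(messages)} messages)"

history = [{"role": "system", "content": "You are a helpful assistant."}]

for user_msg in ["Set the tone.", "Refine the style.", "Write the final copy."]:
    history.append({"role": "user", "content": user_msg})
    reply = call_model(history)  # the entire history goes out on each call
    history.append({"role": "assistant", "content": reply})

print(history[-1]["content"])  # the last call saw 6 messages in total
```

This is effectively what a chain of "Create a chat completion" modules has to do by hand: each module must be given all the prior messages it is supposed to "remember".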

Alternatively, you can use the “Message an Assistant” module, which creates a new thread; you will still need to store the thread ID yourself and reuse it in a later operation.
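The store-and-reuse step can be reduced to a simple get-or-create lookup keyed by whatever identifies the conversation (e.g. an Airtable record ID). A sketch of that pattern, where `create_thread` is a stand-in for the real `client.beta.threads.create()` call and `store` stands in for your database:

```python
# Pattern for reusing an assistant thread: create it once, persist its ID,
# and look it up on every later run so the assistant keeps the context.

store = {}  # stand-in for your database, keyed by scenario/record ID


def create_thread():
    # Stand-in for client.beta.threads.create(); the real call returns
    # an object whose .id you would save.
    return "thread_abc123"


def get_or_create_thread(record_id):
    # Reuse the saved thread ID if we have one; otherwise create and save it.
    if record_id not in store:
        store[record_id] = create_thread()
    return store[record_id]


first = get_or_create_thread("rec42")   # first run: creates and stores the ID
second = get_or_create_thread("rec42")  # later runs: the same ID comes back
```

In Make terms, this is one "search records" operation plus an occasional "create record" operation per conversation, which is the overhead the reply below is lamenting.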


That’s what I feared. So many operations going back and forth to the database, re-sending data in a new thread, etc. Definitely not ideal.