Best way to chain together OpenAI prompts?

I have a list of prompts that I want to chain together, so that the response of one prompt is used in the next prompt, similar to the screenshot below.

Essentially carrying on a conversation thread.

What is the best way to set this up that DOESN’T involve just chaining the modules together (like in my screenshot)?

The main reason is that I want to make it dynamic, so that a 3-prompt sequence or a 10-prompt sequence can both run in the same scenario.
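
In code terms, this is the behaviour I'm after (a rough Python sketch just to illustrate, using the OpenAI chat completions endpoint; the prompts and model name are placeholders):

```python
# Rough sketch of the desired chain: each response is fed back in as context
# for the next prompt, and the prompt list can be any length (3, 10, ...).
from openai import OpenAI

client = OpenAI()

prompts = [
    "Summarise this article: ...",
    "List the three key points from that summary.",
    "Turn those points into a short LinkedIn post.",
]

messages = []  # running conversation history
for prompt in prompts:
    messages.append({"role": "user", "content": prompt})
    response = client.chat.completions.create(model="gpt-4o", messages=messages)
    reply = response.choices[0].message.content
    messages.append({"role": "assistant", "content": reply})

print(messages[-1]["content"])  # output of the last prompt in the chain
```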

Thanks

Welcome to the Make community!

Delete all the OpenAI modules and use the Message an Assistant module, where you can use the same thread id.

This way you only need a single AI module.
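
Conceptually it's the same as threads in the OpenAI Assistants API: every prompt is posted to the same thread id, so the assistant always sees the full history. A rough Python sketch of that idea, assuming a recent OpenAI Python SDK and a placeholder assistant id (not Make-specific, just to show the concept):

```python
# Thread-id sketch: one thread holds the whole conversation, so each run
# has access to all previous prompts and responses.
from openai import OpenAI

client = OpenAI()
ASSISTANT_ID = "asst_..."  # placeholder for your assistant's id

thread = client.beta.threads.create()  # one thread = one conversation

for prompt in ["First prompt", "Second prompt", "Third prompt"]:
    client.beta.threads.messages.create(
        thread_id=thread.id, role="user", content=prompt
    )
    client.beta.threads.runs.create_and_poll(
        thread_id=thread.id, assistant_id=ASSISTANT_ID
    )

# Newest message first: the assistant's reply to the final prompt.
latest = client.beta.threads.messages.list(thread_id=thread.id).data[0]
print(latest.content[0].text.value)
```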


Thanks, I haven’t used that OpenAI action yet. The only thing is, I use different LLMs, and they don’t all have pre-built modules that let you keep the same thread id. So I’m curious what a module-agnostic way to accomplish this would be. Would I have to set up some sort of temp data store to hold all the prompts and responses, then aggregate them together at the end?
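
Something like this is what I have in mind, in code terms (a rough sketch; call_llm is just a placeholder for whichever provider's API, and the history list stands in for the temp data store):

```python
# Module-agnostic sketch: keep the prompt/response history yourself and hand
# the whole thing to whichever LLM you call next.
def call_llm(messages: list[dict]) -> str:
    """Placeholder: send the accumulated messages to any chat-style LLM API."""
    raise NotImplementedError

def run_chain(prompts: list[str]) -> list[dict]:
    history: list[dict] = []              # acts as the "temp data store"
    for prompt in prompts:
        history.append({"role": "user", "content": prompt})
        reply = call_llm(history)         # the model sees everything so far
        history.append({"role": "assistant", "content": reply})
    return history                        # all prompts + responses, aggregated

# chain = run_chain(["prompt 1", "prompt 2", "prompt 3"])
```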