How do I combine outputs from multiple iterations in a cycle before passing them on in the scenario, without collapsing the parsed items?

My idea is to feed the Iterator with multiple URLs, pass its output to the RSS module to extract at least 3 distinct blog/article/news items from each URL, and pass those to the OpenAI module to read, analyze, and produce a well-structured summary in the form of an email. You could call it an ‘AI news bulletin’ delivered via daily email.

The attached setup came closest to the output I am expecting, but the output quality is poor. I am feeding 5 URLs (5 websites with AI-related updates) to the Iterator and passing its output to the RSS module to retrieve feed items. From there, I have tried two approaches:

Approach 1: RSS —> Array Aggregator —> OpenAI —> Gmail (Send an Email)

This approach successfully populates the RSS output with 15 bundles (3 from each of the 5 URLs) with the right fields. However, the Array Aggregator after the RSS module aggregates all RSS output into a single bundle, leaving no individual article data. The OpenAI module needs the parsed items (title, description, date), but they are aggregated away before they reach it, so it produces only a bare email body with no content.

Approach 2: RSS —> OpenAI —> Gmail (Send an Email)

Connecting the RSS module (Retrieve Feed Items) directly ensures that all bundles are passed to the OpenAI module, but in 5 separate cycles. The RSS module picks one URL at a time, retrieves 3 articles from it, and passes them to OpenAI, which then generates one email with partial information per cycle. With 5 URLs, I get 5 poor-quality emails instead of one good-quality email.

I have tried so many versions of this with different modules and configurations and am feeling helpless, since I am new to Make. I have tried using a router with parallel RSS modules, consecutive RSS modules, a Text Aggregator, and even passing all URLs through the Iterator as a single array instead of as individual items. It should be simple, but I have hit a complete mental block.

I could use some help getting one quality email per day using RSS and OpenAI in this setup.
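Outside of Make, the intended pipeline can be sketched in plain Python to make the data flow concrete: parse each feed, cap it at 3 items, flatten everything into one list, and build a single prompt for one email. The feed strings, function names, and item limit below are hypothetical stand-ins, not Make internals; only stdlib XML parsing is used.

```python
import xml.etree.ElementTree as ET

# Hypothetical stand-ins for feeds the RSS module would fetch.
FEED_A = """<rss><channel>
<item><title>A1</title><description>d1</description><pubDate>Mon</pubDate></item>
<item><title>A2</title><description>d2</description><pubDate>Tue</pubDate></item>
<item><title>A3</title><description>d3</description><pubDate>Wed</pubDate></item>
<item><title>A4</title><description>d4</description><pubDate>Thu</pubDate></item>
</channel></rss>"""
FEED_B = """<rss><channel>
<item><title>B1</title><description>d5</description><pubDate>Fri</pubDate></item>
</channel></rss>"""

def parse_feed(xml_text, max_items=3):
    """Return up to max_items (title, description, date) tuples from one feed."""
    root = ET.fromstring(xml_text)
    items = list(root.iter("item"))[:max_items]
    return [(i.findtext("title"), i.findtext("description"), i.findtext("pubDate"))
            for i in items]

def build_bulletin_prompt(feeds, max_items=3):
    """Flatten all feeds into one list, then build a single LLM prompt."""
    articles = [a for f in feeds for a in parse_feed(f, max_items)]
    lines = [f"- {title} ({date}): {desc}" for title, desc, date in articles]
    return ("Summarize the following AI news items into one "
            "well-structured email:\n" + "\n".join(lines))

prompt = build_bulletin_prompt([FEED_A, FEED_B])
```

The key point is that aggregation happens after parsing but before the LLM call: every article keeps its own fields, and only one prompt (hence one email) is produced, regardless of how many URLs go in.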

Hey, I am not totally sure if this is doable in your case, but have you tried using a Variable module? You can have the Get Variable module run at the beginning of each iteration, then use the Set Variable module when you need to store the variable information. However, to make sure the variables do not overwrite each other, map the output of the Get Variable module first, then the value you wish to add in the Set Variable module, and finally map it into your desired format later in the automation.

Hey there,

In Approach 1, it sounds like you are not mapping the array correctly, so not all the data is reaching the LLM.

In Approach 2, you still need the aggregator module to collapse everything back into one bundle and send only one email. Why did you remove it?

I tried this and it seems like the right approach, but I am still not getting the desired output.

Can you show some screenshots of the modules and what you have setup at the moment?


Change the source module to the iterator, not the RSS one.

I did try that before. If I change the source module to ‘Iterator’, the looping issue is solved and the RSS gives all 5 bundles to the array aggregator at once. But I do not get the desired output.

When Source Module = Iterator:
The Iterator outputs only URLs, not article fields. Hence the aggregator sees 5 bundles (URLs) and tries to aggregate those. The fields I selected (title, description, summary, url, dateCreated) come from the RSS module, NOT the iterator. So the aggregator builds an array with empty objects unless the RSS mapped fields happen to leak into the iteration chain. GPT sees {} repeated → interprets dataset as empty → outputs: “No items provided” → fallback Atlas stub.

OK, great. Do this first, then provide more details.