My idea is to feed multiple URLs into an Iterator, pass them to an RSS module to retrieve at least 3 distinct blog posts/articles/news items from each URL, and then pass those items to the OpenAI module to analyze and produce a well-structured summary in the form of an email. You could call it an 'AI news bulletin' delivered daily via email.
The attached setup came closest to the output I am expecting, but it is not a quality output. I am feeding 5 URLs (5 websites with AI-related updates) into the Iterator and passing its output to the RSS module to retrieve feed items. From there, I have tried two approaches:
Approach 1: RSS → Array Aggregator → OpenAI → Send a Gmail
This approach successfully populates the RSS output with 15 bundles (3 from each of the 5 URLs), with the right fields. However, the Array Aggregator after the RSS module aggregates all RSS bundles into a single array, leaving no individual article data. The OpenAI module needs the parsed per-item fields (title, description, date), but these are collapsed by the aggregator before they reach it. As a result, the OpenAI module produces only a basic email body with no real content.
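To make the failure concrete, here is a minimal Python sketch (for illustration only, since Make itself is no-code) of what I want the aggregation step to produce: one text block that still exposes each article's title, description, and date, ready to drop into a single OpenAI prompt. The item values below are made-up placeholders.

```python
# Illustrative stand-in for the aggregation step in the Make scenario.
# Each dict mimics one parsed RSS bundle; the values are placeholders.
items = [
    {"title": "New model release", "description": "A placeholder summary.", "date": "2024-05-01"},
    {"title": "AI policy update", "description": "Another placeholder summary.", "date": "2024-05-01"},
]

def aggregate_items(items):
    """Join the bundles into one string while keeping each item's fields
    visible, instead of collapsing everything into an opaque array."""
    return "\n\n".join(
        f"Title: {it['title']}\nDate: {it['date']}\nSummary: {it['description']}"
        for it in items
    )

print(aggregate_items(items))
```

In Make terms, this is what a Text Aggregator with the title/description/date fields mapped into its text template would emit, as opposed to an Array Aggregator collecting whole bundles.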
Approach 2: RSS → OpenAI → Send a Gmail
Connecting the RSS module (retrieve feed items) directly ensures that all bundles are passed to the OpenAI module, but in 5 separate cycles. The RSS module picks one URL at a time, retrieves 3 articles from it, and passes them to OpenAI, which generates one email with partial information per cycle. With 5 URLs, I end up with 5 poor-quality emails instead of one good-quality email.
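For contrast, here is a sketch of the overall flow I am after, again as Python pseudocode standing in for the Make modules. `fetch_feed_items` and `summarize` are hypothetical stubs for the RSS and OpenAI modules (no real network or API calls); the point is the control flow: collect all items from all feeds first, then make one model call and send one email.

```python
# Placeholder feed URLs standing in for my 5 real AI-news sources.
FEED_URLS = [f"https://example.com/feed{i}.rss" for i in range(1, 6)]

def fetch_feed_items(url, limit=3):
    """Stub for the Make RSS module: up to `limit` parsed items per feed."""
    return [
        {"title": f"Article {n} from {url}", "description": "Placeholder text.", "date": "2024-05-01"}
        for n in range(1, limit + 1)
    ]

def summarize(prompt):
    """Stub for the OpenAI module; a real API call would go here."""
    return "AI news bulletin:\n" + prompt

# Collect ALL items across feeds before any summarization happens...
all_items = [item for url in FEED_URLS for item in fetch_feed_items(url)]

# ...then build ONE prompt from the per-item fields...
prompt = "\n\n".join(
    f"Title: {it['title']}\nDate: {it['date']}\n{it['description']}"
    for it in all_items
)

# ...and make ONE call, producing ONE email body per day.
email_body = summarize(prompt)
```

In Approach 2, the equivalent of `summarize` runs once per feed inside the cycle, which is exactly why I get 5 partial emails instead of one.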
I have tried so many versions of this with different modules and configurations, and I am feeling helpless since I am new to Make. I have tried using a Router with parallel RSS modules, consecutive RSS modules, a Text Aggregator, and even passing all URLs through the Iterator as a single array instead of individual items. It should be simple, but I have hit a complete mental block.
I could use some help getting one quality email per day using the RSS and OpenAI capabilities in this setup.