RSS Watch stopped working

Hi, I’ve created a scenario with the RSS Watch (ACID) module to distribute content from the Absolutegamer magazine feed to my social media channels (it’s a valid feed, checked with https://rssviewer.app/).

It always posted the newest item from the feed when it was detected, and everything was working fine (the scenario runs on a 15-minute schedule).

Today, it started glitching, retrieving the wrong feed items (older ones) and ultimately not detecting new items in the RSS at all. Then it resolved itself and started functioning as intended again.

The occurrence seems random. I checked the forums, and there doesn’t seem to be another report of this problem in the community.

Any hint about how to prevent this?

Welcome to the Make community!

1. Screenshots of module fields and filters

Could you please share screenshots of the relevant module fields and filters in question? It would really help other community members to see what you’re looking at.

You can upload images here using the Upload icon in the text editor.

2. Scenario blueprint

Please export the scenario blueprint file to allow others to view the mappings and settings. At the bottom of the scenario editor, you can click on the three dots to find the Export Blueprint menu item.

(Note: Exporting your scenario will not include private information or keys to your connections)

Uploading it here will look like this:

blueprint.json (12.3 KB)

Following these steps will allow others to assist you here. Thanks!


Of course. Here it is. I noticed that the “Choose where to start” setting always reverts to “From now on”, even when I select other options like “Select the first RSS feed item”. Is that normal?

blueprint.json (101.7 KB)



I’ve found a workaround, at least for the duplicate entries (it might be a bug).

I’m using a data store to save the published URLs and a search step to check whether the entry already exists: if the URL is present in a record, the flow stops, preventing duplicates from being published. Otherwise, the scenario continues as usual (the “overwrite” option in the “Data store: Add/replace a record” module must be disabled).
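The dedup logic above can be sketched like this (a minimal illustration, assuming an in-memory set stands in for the data store; `should_publish` is a hypothetical name, not a Make feature):

```python
# Stands in for the "published URLs" data store, keyed by URL.
published_urls = set()

def should_publish(item_url: str) -> bool:
    """Mirror the scenario flow: search the store, stop on a hit,
    otherwise add the record (overwrite disabled) and continue."""
    if item_url in published_urls:
        return False  # record exists -> stop the flow, skip the duplicate
    published_urls.add(item_url)  # "Add/replace a record" step
    return True
```

Calling `should_publish` twice with the same URL returns `True` the first time and `False` afterwards, which is exactly the duplicate guard described above.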

This leads to another problem, though: the data store grows quickly, and there’s no need to keep every URL ever posted. For our purposes we only need the most recent ones according to the RSS output (say, the last 20 URLs in the feed).
Is there a way to limit the data store to a fixed number of records (in this case, 20), deleting the oldest one whenever a new one is added?