Need Help Filtering Duplicate Scraped Data in Automation Workflow

Hello! I’m facing an issue with my automation workflow on make.com. Here’s my setup: search airtable → text aggregator → scraper → iterator → create airtable record. The problem is, I want to avoid uploading duplicate scraped data to Airtable. I tried adding a ‘search airtable’ step, but I’m struggling to filter out existing data at the end. Can anyone suggest a solution to ensure that only new scraped data is uploaded to Airtable? Thanks!

I’ve been struggling with this for quite a while now; hopefully someone can help :slight_smile:

Hi @Javier_Solans
You can upsert records (update/insert); see the Make Airtable upsert documentation.

You can look up the record ID with the Search Records module by matching on some unique field.
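
Inside Make this is just a Search Records module plus a filter before Create a Record, but as a rough sketch of the same "check for a duplicate before creating" idea in plain code, here is what it looks like against the Airtable REST API in Python. The token, base ID, table name, and the `Name` field used as the unique key are all placeholders, not anything from your actual base:

```python
# Minimal sketch: search Airtable for an existing record by a unique field,
# and only create a new record when nothing matches.
import requests

API_TOKEN = "YOUR_AIRTABLE_TOKEN"   # placeholder
BASE_ID = "appXXXXXXXXXXXXXX"       # placeholder base ID
TABLE = "Scraped"                   # placeholder table name
UNIQUE_FIELD = "Name"               # field treated as the unique key

BASE_URL = f"https://api.airtable.com/v0/{BASE_ID}/{TABLE}"
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}

def record_exists(value: str) -> bool:
    """Return True if a record with this value in UNIQUE_FIELD already exists."""
    # Naive quoting; assumes the value contains no single quotes.
    formula = f"{{{UNIQUE_FIELD}}} = '{value}'"
    resp = requests.get(BASE_URL, headers=HEADERS,
                        params={"filterByFormula": formula, "maxRecords": 1})
    resp.raise_for_status()
    return len(resp.json().get("records", [])) > 0

def create_if_new(fields: dict) -> None:
    """Create the record only when no record with the same unique value exists."""
    if record_exists(fields[UNIQUE_FIELD]):
        return  # duplicate: skip instead of creating a second record
    resp = requests.post(BASE_URL, headers=HEADERS, json={"fields": fields})
    resp.raise_for_status()
```

In Make, the equivalent is: Search Records on the unique field, then a filter on the route that only lets bundles through when the search returned no results.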


Thank you for the answer. Would it be possible to insert only the new ones and skip the ones whose name already exists in the table? That way fewer records are created and fewer operations are run in make.com.


Without a unique identifier it’s impossible.

But you can use List records via the “Make an API call” module and then aggregate the data, with a filter in between to remove the existing records; that will cost only 2 operations.

For more information about List records, see the Airtable documentation.
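
For anyone curious what that “list everything once, then filter in memory” approach looks like outside Make, here is a rough Python sketch against the Airtable List records endpoint. The token, base ID, table name, and `Name` field are placeholders, and the shape of `scraped_items` is just an assumption for illustration:

```python
# Sketch: one (paginated) List records call to collect existing names,
# then keep only scraped items that are not already in the table.
import requests

API_TOKEN = "YOUR_AIRTABLE_TOKEN"   # placeholder
BASE_ID = "appXXXXXXXXXXXXXX"       # placeholder base ID
TABLE = "Scraped"                   # placeholder table name
UNIQUE_FIELD = "Name"               # field used to detect duplicates

BASE_URL = f"https://api.airtable.com/v0/{BASE_ID}/{TABLE}"
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}

def existing_values() -> set[str]:
    """Fetch all existing values of UNIQUE_FIELD, following Airtable pagination."""
    values, params = set(), {"fields[]": UNIQUE_FIELD}
    while True:
        resp = requests.get(BASE_URL, headers=HEADERS, params=params)
        resp.raise_for_status()
        data = resp.json()
        for rec in data.get("records", []):
            value = rec["fields"].get(UNIQUE_FIELD)
            if value is not None:
                values.add(value)
        offset = data.get("offset")
        if not offset:
            return values
        params["offset"] = offset

def filter_new(scraped_items: list[dict]) -> list[dict]:
    """Keep only scraped items whose unique field value is not already in the table."""
    seen = existing_values()
    return [item for item in scraped_items if item.get(UNIQUE_FIELD) not in seen]
```

The Make version of the same idea is one List records call, an aggregator, and a filter before the Create a Record step, which is why it only adds a couple of operations per run regardless of how many records already exist.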


Would you be up for a chat? LINK
