I’m trying to upload leads to a spreadsheet while checking for duplicates in a specific column, without using hundreds of credits at a time - is this possible?
-
I have a scenario that scrapes leads with Apify, checks them against an existing Google Sheet for duplicates, uploads the new leads, then uses AI for some further enrichment (we can ignore that last part).
This uses up a lot of credits, especially when passing 100+ leads at a time. I tried setting up aggregators, which reduced the credit usage, but since the data is now nested in arrays I'm unable to filter duplicates properly.
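To be clear about the logic I'm trying to reproduce, here it is as a plain Python sketch (the "email" key column and the field names are stand-ins for my actual data):

```python
# Dedup step I'm trying to rebuild in Make: keep only leads whose key
# column ("email" here, as a placeholder) isn't already on the sheet.
def filter_new_leads(leads, existing_emails):
    seen = {e.strip().lower() for e in existing_emails}
    new_leads = []
    for lead in leads:
        key = lead.get("email", "").strip().lower()
        if key and key not in seen:
            seen.add(key)              # also catches duplicates within the batch
            new_leads.append(lead)
    return new_leads
```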
Original scenario:
This is the test scenario I'm using to try to reduce operations:
My first problem is that I'm unsure how to map everything so that all items in the array are bulk-uploaded to Google Sheets.
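For illustration, the single batched write I'm after would look something like this outside of Make (a gspread sketch; the spreadsheet, worksheet, and column layout are all placeholders):

```python
import gspread

# One batched append instead of one write per lead.
gc = gspread.service_account()             # assumes a service-account JSON is configured
ws = gc.open("Leads").worksheet("Sheet1")  # placeholder spreadsheet/worksheet names

new_leads = [                              # stand-in for the filtered Apify output
    {"name": "Jane Doe", "email": "jane@example.com", "company": "Acme"},
]
rows = [[l["name"], l["email"], l["company"]] for l in new_leads]
ws.append_rows(rows, value_input_option="USER_ENTERED")  # single write for all rows
```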
My other problem is that the filter only checks the first item in the array, which then stops anything from pulling through:
There are duplicates on the sheet, so this first result is technically correct; however, it isn't checking every lead.
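In other words, the current behaviour versus what I need, sketched in Python with placeholder data:

```python
existing_emails = {"a@example.com"}        # stand-in for the sheet's key column
leads = [{"email": "a@example.com"}, {"email": "b@example.com"}]

# Current behaviour: only the first element is tested, so nothing gets
# through whenever leads[0] happens to be a duplicate.
first_is_dup = leads[0]["email"] in existing_emails    # True here

# What I actually need: every lead tested, keeping only the new ones.
new_leads = [l for l in leads if l["email"] not in existing_emails]
print(new_leads)                           # [{'email': 'b@example.com'}]
```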
Is what I'm trying to achieve possible, or am I going to have to suck it up and deal with the credit usage?
Many thanks


