I want to be able to bulk update rows in specific columns without overwriting previous data. I need to bulk update 50 rows at a time, in columns A, B, and C. The reason is, I’m scraping the website of each particular lead to generate a summary I can then use to create an icebreaker message for my cold email campaign.
What is the problem & what have you tried?
It works fine adding values to the new fields/columns when I run the scenario the first time. When I run it a second time, regardless of whether I’m using a bulk add or bulk update rows module, it starts from the beginning and overwrites the previous data. I’ve tried filtering too.
I used the bulk update rows module because the other one had limitations for this use case (I honestly can’t remember what they were exactly…).
I created two Set variable modules that calculated/defined the next Start row and End row. The data needed to decide this came from a Get range Google Sheets module: the total number of bundles for the Start row variable, and the length of the array (_IMTAGGLENGTH_ + 2) for the End row variable.
So, if I already had 45 rows populated with data, the Get range module would retrieve that number and the Start row Set variable would output 46. I’d then count the number of items from the array aggregator and add that to the Start row output (in this case 46 + item count = End row).
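To make the row math above concrete, here’s a minimal sketch of the arithmetic in Python (the function name and exact offsets are my own illustration, not Make modules — the real offset depends on whether a header row counts, e.g. the +2 mentioned above):

```python
def next_row_range(populated_rows: int, batch_item_count: int) -> tuple[int, int]:
    """Compute the Start/End rows for the next bulk update batch.

    populated_rows: rows already filled with data (what Get range reports).
    batch_item_count: number of items coming out of the array aggregator.
    """
    start_row = populated_rows + 1               # first empty row
    end_row = start_row + batch_item_count - 1   # last row this batch fills
    return start_row, end_row

# With 45 rows already populated and a batch of 50 leads:
print(next_row_range(45, 50))  # (46, 95)
```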
The reason I did all this is that the Bulk update module needs the exact row range, which is a pain in the ass, but it is what it is.
I also added an Update rows module just after the Firecrawl module/scraper that wrote “YES” to a column in the first sheet, called “Processed”.
Finally, I added a filter on the trigger/Search rows module that checked for “Processed”: Does not exist, which means the Search rows module will only return rows that haven’t been processed yet (i.e. rows that do not contain “YES” in the Processed column).
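The intended behavior of that filter can be sketched like this (sample row data and field names are made up for illustration; in Make this happens declaratively in the module’s filter, not in code):

```python
# Simulated sheet rows: only rows not yet marked "YES" should pass the filter,
# mirroring the 'Processed: Does not exist' condition on the Search rows module.
rows = [
    {"name": "Acme Corp", "processed": "YES"},
    {"name": "Beta LLC", "processed": ""},
    {"name": "Gamma Inc", "processed": "YES"},
]

unprocessed = [r for r in rows if not r.get("processed")]
print([r["name"] for r in unprocessed])  # ['Beta LLC']
```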
Ohh, I almost forgot: I added error handlers too for the AIs and scrapers (Resume module). I chose those because they still let me populate fields in the row even when the Firecrawl scraper couldn’t scrape/find the website URL or the AI messed up somehow. That way the flow didn’t break and start mixing up the names and rows, etc.