Thousands of Duplicate Runs - Quota Overruns

Hmm, your video indicates there is a serious bug in your flow. You should check the JSON of the exported scenario to look for inconsistencies.
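One quick way to sanity-check an exported scenario is to load the blueprint JSON and tally the modules it contains, so duplicated or unexpected modules stand out. This is only a rough Python sketch; the key names (`flow`, `module`) are assumptions about the export's structure, so open your own file and adjust them if they differ:

```python
import json
from collections import Counter

# Load the exported scenario ("blueprint") JSON.
# NOTE: the key names below ("flow", "module") are assumptions --
# check your actual export and adjust accordingly.
with open("blueprint.json", "r", encoding="utf-8") as f:
    blueprint = json.load(f)

modules = blueprint.get("flow", [])
print(f"Top-level modules: {len(modules)}")

# Tally module types so repeats and oddities are easy to spot.
types = Counter(m.get("module", "unknown") for m in modules)
for name, count in types.most_common():
    print(f"{count:>4}  {name}")
```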

Separately,

I run between 30 and 100 million new rows of data from APIs and web scrapers through Make and Google Sheets per month, and have been able to optimize it down to under 100k total operations per month on a 5-minute update interval, including ETL/data transformation and cleaning.
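The main lever for that kind of reduction is batching: paying one operation per bundle of rows instead of one per row. Here is a minimal sketch of the idea in Python; `write_batch_to_sheet` is a hypothetical placeholder for whatever single bulk-write step you use, not a real Make or Google Sheets call:

```python
from typing import Iterable, Iterator

def chunk(rows: Iterable[list], size: int = 500) -> Iterator[list]:
    """Group incoming rows into fixed-size batches."""
    batch: list = []
    for row in rows:
        batch.append(row)
        if len(batch) >= size:
            yield batch
            batch = []
    if batch:
        yield batch

def write_batch_to_sheet(batch: list) -> None:
    # Placeholder: swap in your actual bulk write (e.g. a single
    # bulk "add rows" step), so each batch costs one operation
    # instead of one operation per row.
    print(f"writing {len(batch)} rows in a single call")

# Example: 10,000 scraped rows become 20 bulk writes, not 10,000.
scraped_rows = ([i, f"value-{i}"] for i in range(10_000))
for batch in chunk(scraped_rows, size=500):
    write_batch_to_sheet(batch)
```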

It’s been a few months since my last operations-usage whoopsie, but I almost always discover that it happened because I made an incorrect assumption about the data or rushed through testing the flows. If operations are being miscalculated due to an issue with a specific scenario, that should be straightforward to identify in the logs and submit to support.

My Most Recent Operations Usage Mistake Story.