Streamlining Automation Processes: Optimizing Execution and Minimizing Operation Count


In this post, I will address a common problem encountered when running automation processes that involve thousands of data points: handled the conventional way, each data point consumes an operation, so the total operation count grows with the size of the dataset.


When running automation processes over a large dataset (thousands of rows), the number of operations required can quickly become overwhelming. This not only leads to slower execution but also increases the risk of encountering errors or exceeding API limits.

Therefore, finding a solution to minimize the operation count is crucial for efficient and error-free execution.

Solution Approach:

To tackle this problem, we have developed an integration that aims to reduce the number of operations throughout the process.

Our approach involves leveraging various techniques, modules, and API calls to optimize the execution flow and streamline data handling.

Actual Execution:

Our client has over 5,000 rows of data stored in ClickUp. Their requirement is to monitor Time Entries and transfer this information to a Google Sheets document, including User Name, Job ID, start and end times, the duration between them, task name, list name, and more. Following the conventional procedure, the scenario would consume one operation per row of data in ClickUp.

The integration we created consists of multiple components, including modules, API calls, and the implementation of specific formulas.


These components work together to effectively process the large dataset, ensuring efficient and accurate execution. By carefully managing operations, we are able to reduce the overall processing time significantly.

Step 1: Use a Text Aggregator to consolidate data coming from the various source modules, such as ClickUp, Email, HubSpot, Zoho CRM, etc.

Step 2: Build the aggregated text from the mappable parameters (such as “value”), applying formulas to calculate the start time, the end time, and the duration between them.

The benefit of the formula is that it can express a duration in HH:mm format given only a number of minutes.
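The same arithmetic can be sketched in Python. This is an illustration of the calculation, not the Make formula itself; the function names are mine, and the assumption that ClickUp reports start/end timestamps as Unix epoch milliseconds reflects the ClickUp Time Entries API:

```python
def minutes_to_hhmm(total_minutes: int) -> str:
    # Convert a duration in minutes to an HH:mm string, e.g. 125 -> "02:05".
    hours, minutes = divmod(int(total_minutes), 60)
    return f"{hours:02d}:{minutes:02d}"


def duration_minutes(start_ms: int, end_ms: int) -> int:
    # ClickUp time entries report start/end as Unix epoch milliseconds,
    # so the difference divided by 60,000 gives whole minutes.
    return (end_ms - start_ms) // 60_000
```

For example, a time entry spanning 125 minutes formats as `minutes_to_hhmm(125)` → `"02:05"`.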

Step 3: Use the “Make an API call” module for Google Sheets. This module lets us call the desired endpoint directly, which gives us access to the batch update functionality. To use it, we need to provide the spreadsheet ID and specify the range of data to be included in the batch update.
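As a minimal sketch of what that API call does, here is the same request assembled in Python. The `values:batchUpdate` endpoint is part of the Google Sheets API v4; the spreadsheet ID, range, and row values below are placeholders, and the helper function name is mine:

```python
import json

SHEETS_API = "https://sheets.googleapis.com/v4/spreadsheets"


def build_values_batch_update(spreadsheet_id: str, a1_range: str, values):
    # Build the endpoint URL and JSON body for a values:batchUpdate request.
    # One call writes the whole range at once, instead of consuming
    # one "Add a Row" operation per line of data.
    url = f"{SHEETS_API}/{spreadsheet_id}/values:batchUpdate"
    body = {
        "valueInputOption": "USER_ENTERED",  # let Sheets parse dates/durations
        "data": [{"range": a1_range, "values": values}],
    }
    return url, json.dumps(body)
```

In the Make scenario, the equivalent body is pasted into the module's Body field, with the aggregated text mapped into the `values` array.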

Step 4: Build a request body that maps the aggregated values into Google Sheets.

The body field contains the sheet name, the target range, and the values to be populated in the designated sheet.
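As a sketch, the request body for the `values:batchUpdate` endpoint looks like this (the sheet name, range, and row values are illustrative):

```json
{
  "valueInputOption": "USER_ENTERED",
  "data": [
    {
      "range": "Sheet1!A2:G5001",
      "values": [
        ["Alice", "JOB-101", "09:00", "10:05", "01:05", "Design review", "Sprint 12"]
      ]
    }
  ]
}
```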

Scenario Image:


Benefits:

⦁ Faster execution and completion of the automation process
⦁ Reduced operation count
⦁ Minimized risk of hitting API limits
⦁ Smoother flow of data with fewer disruptions
⦁ Increased productivity through better time management and more efficient allocation of resources


In summary, our solution addresses the challenges faced when running automation processes involving a vast amount of data that would consume a high number of operations with the conventional approach.

By creating an integration that optimizes the execution flow and reduces the operation count, we enable faster execution, smoother data processing, and improved time management. Our solution not only enhances efficiency but also mitigates the risk of encountering errors or API limitations.

With these benefits in mind, our approach provides a valuable solution for optimizing large-scale automation processes.


Hey @GrowwStacks!

It’s true that when we start working with larger data in Make, it’s crucial to be mindful of our operations and learn how to build smart scenarios :nerd_face:

Your tips are really appreciated! Thanks so much for sharing your insights with all of us.

Keep up the great work! :clap:
