Optimizing Webhook Data Handling: Aggregating Incoming Data for Efficient Google Sheets Updates

Hi everyone,

I’m working on a scenario where webhook data arrives every second or more frequently, and I need to aggregate this data to prevent API throttling while sending bulk updates to Google Sheets. My current challenge is figuring out the best approach for temporary storage and batch processing, considering the limitations of certain tools and services.

Here’s what I’m dealing with:
• Frequency of Webhooks: Data arrives every second or faster.
• Final Destination: The aggregated data needs to be stored in Google Sheets for further analysis.
• Main Challenge: Avoid hitting Google Sheets’ API limits by sending bulk data updates (e.g., every two hours) instead of processing individual webhook requests.

Options I’m Exploring

1.	Temporary Storage:
•	Built-in Data Stores: Make.com offers data stores, but they don’t support retrieving the entire dataset in one go, which makes it hard to pull and process all stored records at once for a bulk update.
•	Google Sheets as Temporary Storage (probably a bad option): Using a separate tab or sheet for temporary storage could work, but the frequent writes might still trigger API throttling.
•	External Databases: possible, but pricing is a concern…
2.	Separate Scenarios for Aggregation and Updates:
•	I’m thinking about splitting the process into two distinct scenarios:
•	One scenario to handle and store incoming webhook data temporarily.
•	Another scenario to trigger every two hours (or on some other time interval) to retrieve the aggregated data and send it to Google Sheets in bulk (set values / add rows as an array; see the sketch after this list).
3.	Custom Scripting:
•	If the above options aren’t feasible, I might consider using custom scripting outside Make.com to manage the aggregation and batching process. However, I’d prefer to stick within the Make.com ecosystem if possible for now.
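To illustrate what the bulk-update half could look like if I did go down the custom-scripting route, here is a minimal sketch in Python using the official google-api-python-client. The spreadsheet ID, range, and credentials file are placeholders, and the temporary store that feeds `buffered_rows` is whatever option wins above; the point is simply that all buffered events go to Sheets in a single `values.append` call instead of one request per webhook.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SPREADSHEET_ID = "your-spreadsheet-id"   # placeholder
TARGET_RANGE = "Log!A:C"                 # placeholder tab/range

def flush_buffer_to_sheets(buffered_rows):
    """Append every buffered row in one bulk values.append call."""
    creds = service_account.Credentials.from_service_account_file(
        "service-account.json",          # placeholder credentials file
        scopes=["https://www.googleapis.com/auth/spreadsheets"],
    )
    service = build("sheets", "v4", credentials=creds)
    service.spreadsheets().values().append(
        spreadsheetId=SPREADSHEET_ID,
        range=TARGET_RANGE,
        valueInputOption="RAW",
        insertDataOption="INSERT_ROWS",
        # buffered_rows is a list of row lists, e.g. [[timestamp, event, payload], ...]
        body={"values": buffered_rows},
    ).execute()

# Example: called every two hours with everything collected since the last flush.
# flush_buffer_to_sheets([["2024-01-01T00:00:01Z", "signup", "{...}"], ...])
```

Running something like this on a two-hour schedule would mean only a dozen Sheets API writes per day, regardless of how many webhooks fired in between.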

What Would Be Ideal

The ideal solution would allow me to:
• Temporarily store webhook data without worrying about API limits.
• Retrieve the entire dataset in bulk efficiently for processing and updating Google Sheets.
• Stay within Make.com’s capabilities if possible, without needing extensive external integrations.

Questions for the Community

•	Has anyone tackled a similar use case with high-frequency webhooks and bulk data updates?
•	Are there better ways to aggregate and batch-process data within Make.com’s ecosystem?
•	Any creative solutions for temporary storage that work well with Make.com scenarios?

Thanks in advance for your insights and ideas!

And if I missed something please ask.

Hi @Si,

With Webhooks, do you need to send responses? If not, the webhook will continue to accept data up to a limit that depends on your plan.
You can then set up the webhook module to process X results per cycle (enable Advanced Settings on the module to see this option).

Then you can schedule the scenario to run periodically and set the maximum number of results per cycle to match the number of rows you want to push into Google Sheets in one go.

I feel like you can find a good balance here, assuming you don’t need the scenario to run immediately when the webhook is called and you don’t need it to send responses.


Thank you for answering so fast!

Can we just clarify a bit more:

  • you propose running the scenario at intervals and processing the queued webhook data in several cycles at once
  • I would need to aggregate that data (loop over the initial cycles reading the webhook queue) into an array?
  • and then use Add Rows to push the data to Sheets?

What about an alternative approach that doesn’t rely on a somewhat complicated scenario but on a very simple one instead?
E.g., sending the webhook data to BigQuery and only displaying it in Sheets? (A rough sketch of that idea is below.)

Why I am asking:

  • there might be hours where 500 events fire in quick succession → since all of that data would need to be processed by the scenario, running a complex scenario for it seems a bit too costly. I only need it as a log for later analysis, and no action is tied to it (transactional/user-action data).
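To make the BigQuery idea concrete, here is a minimal sketch of what the ingestion side could look like, assuming the google-cloud-bigquery client library and application-default credentials; the project, dataset, and table names are placeholders. Each webhook payload would be streamed into BigQuery (by a small endpoint or an HTTP call from the scenario), and Sheets would only read the table later, e.g. via Connected Sheets, so no Sheets API quota is consumed per event.

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials
TABLE_ID = "my-project.webhook_logs.events"  # placeholder project.dataset.table

def log_webhook_event(event: dict) -> None:
    """Stream one webhook payload into BigQuery as a single JSON row."""
    errors = client.insert_rows_json(TABLE_ID, [event])
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")

# Example:
# log_webhook_event({"received_at": "2024-01-01T00:00:01Z",
#                    "type": "signup", "payload": "{...}"})
```

This is only a sketch of the alternative I’m asking about, not something I have running; I’d still need to weigh BigQuery pricing against the cost of operations in a Make.com scenario.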

Thank you again for your answer!