Hi everyone,
I’m working on a scenario where webhook data arrives every second or more frequently, and I need to aggregate it so I can push bulk updates to Google Sheets without hitting API throttling. My current challenge is choosing the best approach for temporary storage and batch processing, given the limitations of the tools and services involved.
Here’s what I’m dealing with:
• Frequency of Webhooks: Data arrives every second or faster.
• Final Destination: The aggregated data needs to be stored in Google Sheets for further analysis.
• Main Challenge: Avoid hitting Google Sheets’ API limits by sending bulk data updates (e.g., every two hours) instead of processing individual webhook requests.
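To make the “bulk update” part concrete, here’s a minimal sketch of the kind of two-hourly write I have in mind, assuming the Google Sheets API v4 Python client with a service account; the spreadsheet ID, credentials file, and sample rows are placeholders, not something I’ve built yet:

```python
from google.oauth2.service_account import Credentials
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/spreadsheets"]
SPREADSHEET_ID = "YOUR_SPREADSHEET_ID"  # placeholder

creds = Credentials.from_service_account_file("service-account.json", scopes=SCOPES)
sheets = build("sheets", "v4", credentials=creds)

# Rows collected over ~2 hours of webhooks; each row is a list of cell values.
rows = [
    ["2024-01-01T00:00:00Z", "event_a", 42],
    ["2024-01-01T00:00:01Z", "event_b", 7],
]

# One append call writes all rows at once instead of one API call per webhook.
sheets.spreadsheets().values().append(
    spreadsheetId=SPREADSHEET_ID,
    range="Sheet1!A1",
    valueInputOption="RAW",
    insertDataOption="INSERT_ROWS",
    body={"values": rows},
).execute()
```

The point is that a single `values.append` with an array of rows counts as one request against the quota, no matter how many webhooks fed into it.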
Options I’m Exploring
1. Temporary Storage:
• Built-in Data Stores: While Make.com offers data storage, it doesn’t support bulk retrieval of the entire dataset in one go. This makes it challenging to retrieve and process all stored data at once for bulk updates.
• Google Sheets as Temporary Storage (probably a bad option): Using a separate tab or sheet for temporary storage could work, but the frequent writes would still risk triggering API throttling.
• External Databases: possibly an option, but I’m concerned about pricing.
2. Separate Scenarios for Aggregation and Updates:
• I’m thinking about splitting the process into two distinct scenarios:
• One scenario to handle and store incoming webhook data temporarily.
• Another scenario that triggers every two hours (or on some other time interval) to retrieve the stored data and send it to Google Sheets in bulk (set values / add rows as an array).
3. Custom Scripting:
• If the above options aren’t feasible, I might fall back on custom scripting outside Make.com to handle the aggregation and batching (rough sketch below). However, I’d prefer to stay within the Make.com ecosystem if possible, at least for now.
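For reference, this is roughly the kind of external script I mean for option 3: a small Flask receiver that buffers incoming webhook payloads and flushes them on a timer. Everything here (the endpoint path, the flush_to_sheets helper, the two-hour interval) is hypothetical and only illustrates the buffering idea:

```python
import threading
import time

from flask import Flask, request

app = Flask(__name__)

buffer = []                  # webhook payloads accumulated between flushes
buffer_lock = threading.Lock()
FLUSH_INTERVAL_SECONDS = 2 * 60 * 60  # flush every two hours


@app.route("/webhook", methods=["POST"])
def receive_webhook():
    # Just store the payload; no Sheets API call happens here.
    with buffer_lock:
        buffer.append(request.get_json(silent=True) or {})
    return "", 204


def flush_to_sheets(batch):
    # Placeholder: convert payloads to rows and send one bulk append request,
    # e.g. the values.append call sketched earlier in this post.
    pass


def flush_loop():
    # Periodically drain the buffer and push everything as one bulk update.
    while True:
        time.sleep(FLUSH_INTERVAL_SECONDS)
        with buffer_lock:
            batch = buffer[:]
            buffer.clear()
        if batch:
            flush_to_sheets(batch)


threading.Thread(target=flush_loop, daemon=True).start()

if __name__ == "__main__":
    app.run(port=8080)
```

I’d rather not run and host something like this myself, which is why I’m hoping there’s a clean way to do the same buffering inside Make.com.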
What Would Be Ideal
The ideal solution would allow me to:
• Temporarily store webhook data without worrying about API limits.
• Retrieve the entire dataset in bulk efficiently for processing and updating Google Sheets.
• Stay within Make.com’s capabilities if possible, without needing extensive external integrations.
Questions for the Community
• Has anyone tackled a similar use case with high-frequency webhooks and bulk data updates?
• Are there better ways to aggregate and batch-process data within Make.com’s ecosystem?
• Any creative solutions for temporary storage that work well with Make.com scenarios?
Thanks in advance for your insights and ideas!
If I’ve missed any details, please ask.