Should I use a filter to stagger GMB posts?

I have an automation where a client fills out a simple form (service performed, location it was performed in, image upload), the submission is sent to Make via a webhook, that info is pushed to ChatGPT to write a GMB update/post, and the result is sent to my client's GMB profile as an update.

Here’s my question: they have several techs out in the field using this, and I want at least 30 minutes between each update that gets sent to GMB. Can I use filters to do this? Add a router?

I’ve seen tutorials using a data store, but that doesn’t seem to solve the problem of staggering the updates sent to GMB. If two techs submit an update within 5 minutes of each other (which has happened), a data store only delays the updates; it doesn’t stagger them.

Any help would be greatly appreciated - thanks :-).

Hi @David_D1 and welcome to the Make Community!

If you’re OK using the data store, put the tech’s data in the DS when they submit it. Then every 30/35/60 (whatever) minutes, have another scenario check the data store; if something is in it, post that information to GMB and then delete the entry in the DS (or mark it as completed).
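In plain logic, each run of that scheduled scenario would look something like this (a minimal Python sketch; an in-memory list of dicts stands in for the Make data store, and `post_to_gmb` is a hypothetical placeholder for the GMB module):

```python
def run_scheduled_check(data_store, post_to_gmb):
    """One execution of the scheduled scenario: post the oldest
    pending entry (if any) to GMB and mark it completed."""
    pending = [e for e in data_store if not e["completed"]]
    if not pending:
        return None  # nothing to post this cycle
    entry = min(pending, key=lambda e: e["submitted_at"])
    post_to_gmb(entry)         # hypothetical GMB call
    entry["completed"] = True  # or delete the record instead
    return entry
```

Because the scenario fires only once per interval and posts a single entry per run, two submissions that arrive minutes apart end up posted on consecutive runs, at least one interval apart.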

As long as you aren’t getting more submissions per day than the number of scheduled executions, that should work.

L


Thanks for getting back to me so quickly.

The reason I’m trying to avoid the data store is that having a scenario watch it every 30 minutes or so will eat up a lot of credits on my account.

Would this work: setting sequential processing to ‘Yes’, then creating a filter between ChatGPT and GMB that adds a 30-minute delay?

That way, if two team members trigger the automation at the same time, Make would process one with a 30-minute delay, then process the second with a 30-minute delay, achieving what I was hoping for: staggered GMB posts.


Ah, yes credits! I heard about those…

Uhm… the filter doesn’t actually pause the execution for 30 minutes; a filter only evaluates a condition. The condition you’re showing would likely always return true, because you’re checking for the existence of “the current time + 30 minutes.” I think Make does a bit of time travel to see if the Earth will still exist in 30 minutes, and mostly returns TRUE. :smile:

OK, back to the problem at hand…

Here is another possibility: keep the sequential idea and, after submitting to GMB, add 5 or 6 successive sleep modules. A sleep module allows a maximum 5-minute delay, so with 6 of them, and as long as your execution itself doesn’t last more than 10 minutes, your scenario will run in 40 minutes or less, which is the maximum allotted execution time. With sequential processing on, that prevents two simultaneous submissions from posting back to back. It just costs you 5 or 6 extra credits per run.
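The timing effect of sequential processing plus trailing sleeps can be modeled with a small calculation (a Python sketch, not Make code; times are in minutes, and the 10-minute `work_minutes` figure is just the assumption from above):

```python
def staggered_post_times(submission_times, work_minutes=10, sleep_minutes=30):
    """With sequential processing ON, each execution must finish,
    including its trailing sleep modules, before the next one starts,
    so consecutive GMB posts land at least sleep_minutes apart."""
    post_times = []
    scenario_free_at = 0.0
    for t in sorted(submission_times):
        start = max(t, scenario_free_at)       # queued until the scenario is free
        post = start + work_minutes            # the post happens after the real work
        scenario_free_at = post + sleep_minutes  # 6 x 5-minute sleeps hold the slot
        post_times.append(post)
    return post_times
```

For example, two techs submitting 5 minutes apart (`[0, 5]`) would post at minute 10 and minute 50, comfortably more than 30 minutes apart.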

L


Perfect - yes, I’m new to advanced Make automations and have never used filters before, so I was just guessing on that lol.

That makes sense. I had originally added the 5-minute delay modules but didn’t turn on sequential mode, so it didn’t solve my problem. Now, with sequential processing set to ‘Yes’, this should work. 5 or 6 extra credits is a small price to pay compared to having the data store watched every 30 minutes or so.

Thanks so much for your help!


If the solution works, don’t forget to mark it as such so others can benefit.

L