Introduction
The original idea for this project was to create a scheduler for emails - allowing any of my Make scenarios to schedule an email to be sent at any point in the future. For example, sending 30-day reminder emails to prospective subscribers.
I then realized that what I was creating was applicable to any future task - not just sending emails: sending Slack messages, posting on Facebook or Twitter, updating a CRM, etc. The scope is limitless.
I wanted a solution that was…
- Universal : so it can be called and reused from any scenario
- Flexible : supporting any future action type - not just sending emails
- Unlimited : able to schedule any task at any point in the future
- Simple : once set up - it should be straightforward to use and easy to understand
- As low cost as possible : in terms of Make operations (see Part 3 for more details)
I think what I ended up with is fairly neat - and hits most of these goals.
How this works is best demonstrated using a simple example: delaying emails until some point in the future. This is a common request in the community forum - one to which there hasn’t been a great solution. Hopefully, this might help.
Note : The solution requires a basic understanding of things like webhooks and JSON formatting - so it might not be one for Make beginners. However, I do provide all the files and blueprints for the example shown. So, hopefully, most users will be able to implement this without too much difficulty. See Part 3 for how to set this up from scratch.
This post is broken into three parts. The first provides an overview of how this works in practice. The second part goes into the details of how the scheduler is put together - so it will be a bit more advanced. However, the scheduler is only created once - and can then be used anywhere - so these complexities are a one-off. The last part has some final thoughts and goes over the steps required to get this all working.
There's a fair amount to get through - so grab a cup of coffee and let's get going…
Part 1 : A Simple Email Scheduler
To show how this works in practice I’ve created a simple email scheduler. This allows any scenario to schedule an email to be sent out at any time in the future - days, weeks or even months from now.
There are three parts to implementing this - the scenario that wants to send the email, the scenario that actually sends the email, and a simple datastore that allows these two to communicate. In this example I’ll be using a Google Sheet as the datastore - although any other datastore could be used.
We’ll go through these parts in reverse order - as the implementation details are dictated by the part actually doing the work - so sending the emails in this particular case.
Note : The example provided is in its development form - so there are lots of additional checks, status logging, etc. Also, the steps are broken up to try and make them easy to understand. I discuss the costs of this later on in Part 3 - but I'd expect a production version to be much tighter.
The Backend Scenario
This is the final part of the puzzle - the scenario that actually sends out the emails. It’s implemented as a simple webhook.
The scenario has no need to know anything about the scheduling. It just needs to send an email. The complexities of scheduling the webhook call are hidden by the scheduler (see Part 2).
To send out an email - the scenario needs three things…
- The recipient's email address
- The email subject line
- The email body (as HTML)
This information is kept in the datastore (covered in the next section) - so the webhook must be passed some sort of key to allow it to look up this information. For this we will be using the Make UUID (Universally Unique Identifier) as it's guaranteed to be unique.
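For illustration, the body of this webhook call can be as small as a single field. The Unique_ID field name matches the data structure we'll set up in Part 3, and the value shown is just a placeholder:

```json
{
  "Unique_ID": "3f7b2c9a-4d1e-4f6a-9b2c-0123456789ab"
}
```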
Here is what the complete backend scenario looks like…
So let's go through it…
1. This is the webhook trigger. The only thing the trigger provides is the UUID.
2. Next it uses the UUID to look up the correct email details in the Google Sheet datastore (see the next section).
3. It checks if the job has been cancelled - a lot can happen in 30 days.
4. If the job is still on - it sends the email.
5. If that works, it logs the time and date in the datastore.
6. If the email fails - it also logs that in the datastore.
7. Lastly, if the job was cancelled - it updates the datastore confirming nothing was sent.
Note that the final webhook responses are useful for debugging - but would be removed in production - as there’s nothing looking at the results. All the error logging is done via the Google sheet.
The Datastore
This is just somewhere that we can use to store information. You could use one of the built-in Make datastores - but these are pretty limited so I tend to avoid them. In this example we'll use a Google Sheet as they are free and ubiquitous. Feel free to use something else like Airtable.
This is how I’ve set the data up. There are many ways to do this - so don’t feel that you have to follow this example exactly…
- The first four columns are about control. They allow me to see at a glance the status of each email.
- The next three columns are the three parts of the email message. The entries here are just part of my testing.
- The final column is where the backend scenario puts its results (Steps 5, 6 and 7 in the backend scenario). If it's blank then the email hasn't been sent yet.
- The Cancelled column allows me to stop an email from being sent (see Step 3 in the backend scenario)
- If the email address is invalid (as shown here) the last column will show the error. I could have checked the email using something like regex - but it was easier to just trap the failure in Step 4 above.
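To make the layout concrete, here is roughly what a single row holds, written out as key/value pairs. I'm only showing the columns discussed above, and the names are illustrative - the actual headers are in the Email Data tab of the attached Universal Scheduler.xlsx:

```json
{
  "Unique_ID": "3f7b2c9a-4d1e-4f6a-9b2c-0123456789ab",
  "Cancelled": "FALSE",
  "Email_Address": "someone@example.com",
  "Email_Subject": "Your 30 day reminder",
  "Email_Body": "<p>Hello - this is the reminder you asked for.</p>",
  "Result": "Email sent 2024-01-31 09:00"
}
```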
The Frontend Application
This is the scenario that needs to send out a delayed email. The main body of the scenario could be doing anything - that part is not important. I’m presuming we are at the point that it knows the recipient’s address, the subject line and body contents … and the delay required.
My example takes up from that point.
This is what it needs to do…
- This step is only required for testing - as it’s where I create the test emails. I set a recipient, subject, body and delay - just as the production scenario would have.
- The scenario then saves all these details to the above datastore - for retrieval by the backend webhook
- It then creates a simple JSON package containing the delay required and the webhook URL of the backend scenario. These two bits of information are all the scheduler needs - it doesn't need to know anything about what is being scheduled. There's an example of the payload just after this list - and the full details are covered in Part 2.
- It then calls the scheduler webhook - passing along the JSON data.
- If the call fails it does nothing. The fact that the datastore wasn’t updated will indicate that the email was never sent. I’ve done a lot of testing sending 100s of emails - and this never happened.
- If the call works - then it updates the datastore
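For reference, the JSON package posted to the scheduler webhook is tiny. The four fields match the data structure created in Part 3 (all text, and the structure is strict, so nothing else is allowed); the values here are placeholders:

```json
{
  "Unique_ID": "3f7b2c9a-4d1e-4f6a-9b2c-0123456789ab",
  "Target_Webhook": "https://hook.eu2.make.com/your-backend-webhook-id",
  "Delay_Mins": "43200",
  "Job_Type": "Email"
}
```

A delay of 43,200 minutes is the 30 days from the original example. Notice that the scheduler only ever sees a delay, a URL, a key and a job type - never the email itself.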
Summary So Far
That’s everything that has to be done each time you want to use this system to schedule a new activity…
- Create a backend webhook to do the activity - send the email, post to the blog, update Twitter, etc.
- Create a datastore to hold the information.
- Add four steps to your main scenario.
Simply rinse and repeat.
Part 2 : A Universal Scheduler
This is where the magic happens. Remember - you only have to create the scheduler once, and I provide the blueprints and an example datastore to make this as easy as I can.
So it might be time to refresh your coffee and buckle down…
The Workers
The scheduler is just another webhook. You might remember that it's passed a simple JSON package containing the required delay and the webhook URL of the backend scenario that will actually send the emails (as covered in Part 1).
What the scheduler needs to do is create a brand new worker scenario that’s scheduled with the required delay and is setup to simply call the webhook URL of the backend scenario.
Note : I call these temporary scenarios that the scheduler creates - worker scenarios. This is purely a naming convention that makes sense to me. There is nothing that special about them.
To keep things organized - I keep all of these workers in a separate folder - as there will be as many of them as there are outstanding emails. My current folder looks like this…
- The folder is called Scheduler Workers and it currently has 12 worker scenarios in it
- The top 9 of these are still active - which means that the emails have not yet been sent
- The last 3 are disabled - this means that these emails have been sent. These are automatically deleted after 24 hours so that the folder doesn’t just keep growing.
Don’t worry if the scheduler names look confusing - this will all make sense as I go through how the scheduler works.
The Main Scheduler Process
Let's dive in…
1. This is the webhook receiving the JSON package from the frontend scenario. The package contains the required delay and the URL of the backend.
2. The first thing it does is save these details to its own datastore (a different datastore from the one covered in Part 1). This is covered in the next section.
3. It then creates a blueprint for the worker scenario it's going to create. It took me a few attempts to get this part right.
4. It then creates the worker scenario using this blueprint - scheduling it with the required delay. There's a sketch of what this boils down to at the end of this section.
5. If the creation worked - it turns the worker on.
6. Then it updates its own datastore with the details and returns.
7. If the worker scenario failed to turn on - it logs this.
8. If the worker scenario couldn't be created - it also logs this.
In practice - once you have set all this up and tested it - steps 7 and 8 never seem to happen. They are there just in case, and to ensure that if something does go wrong there will be a log of it.
I provide all of this in the attached blueprints - so you should be able to get this working without too much effort. See Part 3 for more.
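For the curious, steps 3 and 4 boil down to a call to Make's own API - roughly a POST to /api/v2/scenarios on your zone (e.g. https://eu2.make.com) with an Authorization: Token <your API key> header. The body below is only a sketch of the idea: the blueprint string is heavily truncated, the IDs are placeholders, and the exact field names and scheduling format should be checked against the Make API documentation:

```json
{
  "blueprint": "<the worker blueprint as a JSON string - essentially one HTTP module that calls the Target_Webhook, passing the Unique_ID>",
  "scheduling": "{ \"type\": \"once\", \"date\": \"2024-01-31T09:00:00.000Z\" }",
  "teamId": 12345,
  "folderId": 67890
}
```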
The Scheduler Datastore
The scheduler needs to keep track of what it's doing - it does this using its own datastore. Again, I'm using Google Sheets - but any datastore could be used.
Notice that there is one line for each activity (sending emails, updating blog posts, etc.). In this example, the lines match those in the datastore used by the frontend scenario covered in Part 1.
Going through this column by column…
- The first column is the same UUID used in Part 1 - it ensures each activity has a unique key.
- The Target_Webhook is the URL of the backend scenario. In the above case these are all the same, as there is currently just one backend that sends out emails. In the future there might be several of these - updating blog posts, posting on Twitter, etc. Each would have its own webhook URL.
- The Job_Type is just there to make it easier to differentiate the different types of jobs. In this case they are all emails. However, when there are multiple activities being scheduled, this should help when working out what's happening.
- Target_Run_Time is the time that the activity should take place - e.g. when the email should be sent.
- Scheduled is TRUE if the worker was created and scheduled (updated in Step 6 above).
- The next column is the name of the worker scenario. You'll see that these match the scenarios in the Scheduler Workers folder covered above.
- Then comes the scenario ID that Make gave the worker - this is used later to delete the scenario once it's no longer required.
- Deleted is a flag that indicates that the worker has been deleted. Note that there are 17 rows in the above sheet - 5 of which have been deleted. This leaves 12 active workers - the same number as in the Scheduler Workers folder.
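Written out as key/value pairs, one row of this sheet looks roughly like the following. The first, sixth and seventh headers aren't named above, so the names used here (Unique_ID, Worker_Name, Worker_Scenario_ID) are just stand-ins - the real headers are in the Scheduled Jobs tab of the attached spreadsheet:

```json
{
  "Unique_ID": "3f7b2c9a-4d1e-4f6a-9b2c-0123456789ab",
  "Target_Webhook": "https://hook.eu2.make.com/your-backend-webhook-id",
  "Job_Type": "Email",
  "Target_Run_Time": "2024-01-31 09:00",
  "Scheduled": "TRUE",
  "Worker_Name": "Scheduler Worker - Email - 2024-01-31 09:00",
  "Worker_Scenario_ID": "1234567",
  "Deleted": "FALSE"
}
```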
Cleaning Up
As the emails are scheduled and new workers are created - the number of worker scenarios will steadily increase. You could manually go through and delete the worker scenarios once they have finished. However, it’s much easier to create a new scenario that will automatically do this for us…
As this whole system was going through its testing phase - I decided to delete the workers in two steps. First I mark them as “(finished)” and then after 24 hours I actually delete them. You might want to do this in one step - and that's fine.
Looking at the above…
- I retrieve a list of all the worker scenarios that have finished (they are inactive)
- I then look to see how long it's been since they ran
- If it’s over an hour - I add the string “(finished)” to the worker’s name
- If it’s over 24 hours - I delete them.
- Once deleted, it looks up the entry in the scheduler datastore
- And marks it as deleted (column H).
Part 3 : Wrapping It All Up
That was a lot to get through - and if you’re still reading - well done! There are a couple of final things that I need to cover. I’ll keep them brief.
The Costs
Nearly every step of a Make scenario costs one operation - and there's a limit to the number of these operations (ops) you can do in a month. So for most of us it's important to be as efficient as possible - especially if you are sending out thousands of emails, as these ops will soon add up.
What I’ve outlined above is intended to be a starting point - and I acknowledge that it’s not efficient. I’ve traded cost for clarity. However, once you get all this working - you should start packing it down so that the number of operations is minimized.
In the above example - the number of additional ops required for scheduling each email is around 14. I have an optimized version that uses 10. However, if you only need to send out scheduled emails - and don’t need the scheduler to be universal - you can get rid of the scheduler and put all the logic in the frontend and do it in just 6. Getting it lower than that would be a challenge!
So sending out 100 emails will cost somewhere between 600 and 1,400 ops.
Recreating My Example
I’ve been through the process of recreating all of the above a couple of times and have a few tips to make it a bit easier.
Note : I’ve tried hard to make these steps as clear and complete as possible. However, there might be some steps that work out differently for you. If this is the case - please add a comment below - and I’ll try and update this - so that it’s complete.
- First, add the Universal Scheduler.xlsx spreadsheet to Google Sheets (or wherever you decide to put it). The tab called Email Data is the datastore for the frontend/backend and Scheduled Jobs is the datastore for the scheduler.
- Next, create two folders in Make - one to store the four main scenarios you'll be creating and one to store all the worker scenarios. I've called these Scheduler and Scheduler Workers - but you can call them anything.
- Next, create a new scenario in the Scheduler folder and import the backend webhook blueprint (blueprint.email_backend_webhook.json).
- Open up the first module - Webhook Trigger - and create the webhook. I called mine Example Email Backend Webhook. Make a note of the webhook URL - you will need it later.
- Update the four Google modules to point to the spreadsheet. If you don't already have a connection to Google Sheets you'll have to create one. Note that the first Google module - Fetch Email Details - will have an entry called 1.Unique_ID - this is OK - ignore it for now.
- Update the Send Email module to use your own email details. If you've not done so already, you will have to create an email connection to whatever system you use to send emails. Hopefully you have this set up already.
- Turn on the scenario so that it runs Immediately as data arrives. Then save the scenario for now. One down and three to go.
- Create another new scenario in the same folder and import the frontend example blueprint (blueprint.email_frontend_example.json).
- Like before - update the two Google references.
- Open the Create Email Example module at the start, put your email address in the Email_Address variable value and set a delay in Delay_Mins - I typically start with 1, meaning a 1 minute delay. You might want to update the formula in Email_Subject so that the same delay is shown.
- Then open the Create JSON module and select Create a data structure. Give it a name - for example Universal Scheduler Data Structure - and then add these four items…
  - Name : Unique_ID, Type : Text, Required : Yes
  - Name : Target_Webhook, Type : Text, Required : Yes
  - Name : Delay_Mins, Type : Text, Required : Yes
  - Name : Job_Type, Type : Text, Required : Yes
- Make sure Strict is set to Yes at the bottom and then Save the data structure.
- You should now be able to add the URL of the backend webhook you saved in the previous steps to the Target_Webhook field. Once done, close the JSON module.
- Save and exit this scenario for now. We can't fill out the call to the Universal Scheduler - as we've not yet created it! So…
- Create a third scenario and import the scheduler blueprint (blueprint.universal_scheduler.json). This might look complicated - but you've come this far - you'll be fine!
- Open the Receive Next Job module at the start. Create the webhook - I called mine Universal Scheduler Webhook (because I'm wild like that). Make a note of the URL - as you'll need it in a minute.
- Then update the four Google modules to point to the Scheduled Jobs sheet. Note that, again, the first module will have some unresolved entries - ignore them for now.
- Open the Create Worker module. If you don't have a Make Connection you will have to create one at this point. The Environment URL suggested will be something like https://eu1.make.com - HOWEVER this didn't work for me - I had to use https://eu2.make.com
- For the API Key - open up a new browser tab and go to your profile in the Make console. There should be an API tab…
- Click on Add token - give it a name and select the following Scopes…
  connections:read connections:write organizations:read organization-variables:read scenarios:read scenarios:run scenarios:write
- IMPORTANT : copy the long key before you click off this page - as next time you come here you will only be able to see part of the key. The key will look something like this: 7f94c90b-1774-4e25-9aa6-60bca98d171b (but obviously different!).
- You will have to fill out your own Organization ID and Team ID and select the workers folder you created at the start.
- Open the Turn Worker On module - and select the Make Connection you just created.
- Save the scenario and turn it on - so that it runs Immediately as data arrives. Just one more to create!
- Create the final new scenario and import the scheduler cleanup blueprint (blueprint.universal_scheduler_cleanup.json).
- As usual - update the two Google modules.
- Open the Fetch Inactive Workers module - and fill out your own Organization ID and Team ID and the workers folder you created at the start.
- Open the Mark as Finished and Delete Scenario modules - and select your Make Connection.
- We are nearly ready to run - just one last thing.
- Reopen the frontend example scenario. Open the Call Universal Scheduler module and paste in the URL of the scheduler webhook at the top. Save the scenario.
- We can now run the frontend scenario - so click on Run once. It should run without any errors and update the Google sheet - Email Data. Check that there's a new row and the details look OK.
- Open the universal scheduler. Look in the History section - there should be a Success there. Check the Google sheet Scheduled Jobs. There should be a new line. Also check that there is a worker scenario in the Make folder. The worker should run after 10 minutes (the default delay I put in the frontend - or whatever you changed it to).
- After the delay - you should get the email. Both Google sheets should be updated and the worker scenario should be disabled. Check all these.
- If there are issues - you can look in the History sections of the scenarios to debug. One of the best ways to test is to put all the webhooks on manual and then process the calls manually so you can see the scenarios actually run.
Final Thoughts
When recreating this - be patient - each step is relatively simple - there’s just a lot of them.
Lastly, feel free to hack this as much as you want and really make it your own! There's quite a lot of work that went into this post - and I really hope some of you will benefit from it.
The Files
blueprint.email_backend_webhook.json (129.9 KB)
blueprint.email_frontend_example.json (53.8 KB)
blueprint.universal_scheduler.json (97.7 KB)
blueprint.universal_scheduler_cleanup.json (72.7 KB)
This is the spreadsheet - I’ve had to rename it with an .XLX extension to get it to upload. When you download it - change the extension to .XLSX before uploading into Google Sheets…
Universal Scheduler.xlx (87.7 KB)