I have a Google Sheet containing all my user data (around 100 users = 100 lines). Each line contains one user, who then gets processed through a series of different APIs in the scenario. When one line is done, Make starts on the next one, and so on. I run this scenario on a daily schedule at a specific time.
This works reliably, but to get through all 100 users simply takes too long.
Therefore, I’m now wondering if there’s a way to set up the scenario to take all 100 lines and start processing them simultaneously?
Or, if there's no way to get them out of Google Sheets simultaneously, is there an easy way to submit them all to Make at the same moment? I'm guessing that if I were able to send all the data simultaneously via webhook, the operations would all start at once. However, that seems counterintuitive, since all my data is already in the Google Sheet (and the data is static).
Hi Leander,
When using webhooks, Make scenarios can “multithread”, i.e. two or more copies of the scenario can be executing at once (I know it is not multi-threading in the traditional sense), but maybe you could build on that to process your data faster.
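For example, here's a rough (untested) sketch of what pushing each row to a custom webhook could look like from a small script. The webhook URL and the column names are just placeholders for whatever your sheet actually contains; the point is that each POST kicks off its own scenario execution, so the rows are no longer processed one after another:

```ts
// Rough sketch (untested): POST each sheet row to a Make custom webhook
// so that every row triggers its own scenario run in parallel.
// The webhook URL and row shape are placeholders - adjust to your setup.

type UserRow = {
  id: string;
  email: string;
  // ...whatever other columns your sheet has
};

const WEBHOOK_URL = "https://hook.<region>.make.com/your-webhook-id"; // placeholder

async function pushRow(row: UserRow): Promise<void> {
  const res = await fetch(WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(row),
  });
  if (!res.ok) {
    console.error(`Row ${row.id} failed: ${res.status}`);
  }
}

async function pushAll(rows: UserRow[]): Promise<void> {
  // Fire all requests at once; each one starts a separate scenario execution.
  await Promise.all(rows.map(pushRow));
}

// Example: rows would normally come from the Google Sheet (e.g. via the
// Sheets API or an Apps Script export) - hard-coded here just to show the shape.
pushAll([
  { id: "A101", email: "user1@example.com" },
  { id: "A102", email: "user2@example.com" },
]).then(() => console.log("All rows submitted"));
```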
I may be talking out of my backside here, as I am somewhat new to Make, but it would seem that a router should do the trick. More accurately, setting up some sort of differentiator for the router to route on would do it.
The differentiator is simply a column of letters next to a column of 3-digit numbers; combine the two to make an “id” that starts with a letter and continues with the numbers.
The router routes based on whether the ID field contains an A, B, C, D, or E. I stopped there. I grabbed 10 rows at a time, and it seems to have processed them asynchronously.
Granted, I did not do a boatload of stuff on each pass, but the completion times support asynchronicity.
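To make that concrete, here's a small illustration (not Make-specific code, just my own sketch of the idea): building the letter + 3-digit “id” and splitting the rows by the leading letter is essentially what the router's filters on A through E are doing.

```ts
// Illustration only: build the "letter + 3-digit number" id and group rows
// by the leading letter, the same way router filters on A-E would split them.

type SheetRow = { letter: string; num: number };

function makeId(row: SheetRow): string {
  // e.g. letter "A" + number 1 -> "A001"
  return `${row.letter}${row.num.toString().padStart(3, "0")}`;
}

function partitionByLetter(rows: SheetRow[]): Map<string, string[]> {
  const groups = new Map<string, string[]>();
  for (const row of rows) {
    const id = makeId(row);
    const key = id.charAt(0); // the router would filter on this letter
    const bucket = groups.get(key) ?? [];
    bucket.push(id);
    groups.set(key, bucket);
  }
  return groups;
}

// Example: rows with different letters end up in separate buckets,
// one per router route, so each group can be worked on side by side.
const demo = partitionByLetter([
  { letter: "A", num: 1 },
  { letter: "B", num: 2 },
  { letter: "A", num: 3 },
]);
console.log(demo); // Map { "A" => ["A001", "A003"], "B" => ["B002"] }
```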