We are currently building a large scenario that takes all participants of a Zoom webinar, loops through them, generates a PDF certificate for each of them, and then sends the certificates to the participants via SMTP email.
The problem is that the mail service has a limit of 10 mails per minute, so we added a Sleep module to stay under that limit.
Now we have another problem: with around 500 participants (500 bundles in the iterator), we exceed the maximum execution time of 40 minutes.
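A quick back-of-the-envelope check shows why the two constraints collide (assuming one email per bundle and a Sleep just long enough to stay under 10 mails per minute):

```python
# 10 mails/minute means at least 6 seconds between mails.
MAILS_PER_MINUTE = 10
PARTICIPANTS = 500
MAX_EXECUTION_MINUTES = 40

seconds_per_mail = 60 / MAILS_PER_MINUTE          # 6 s minimum spacing
total_minutes = PARTICIPANTS * seconds_per_mail / 60

print(f"{total_minutes:.0f} min needed vs. {MAX_EXECUTION_MINUTES} min allowed")
# → 50 min needed vs. 40 min allowed
```

So even with the smallest compliant delay, the run needs about 50 minutes and cannot fit into the 40-minute window.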
In another thread I saw a suggestion to store incomplete executions and catch the error with a "Break" error handler. Unfortunately, this does not work for us: if the Break occurs inside an iterator, restarting the incomplete execution restarts the whole iterator from bundle 1.
Does anyone have any suggestions and ideas to overcome this problem?
Maybe someone has a better solution than what I can offer at the moment, and I would need more information on how the scenario is set up, but here is what you can do:
a) Create a separate scenario that contains the mail service module that sends the email. This scenario will be triggered through a webhook, so put a Custom Webhook module as its starting module.
b) Copy the webhook URL from the new scenario and use an HTTP module in the existing scenario to trigger that sub-scenario. Place the HTTP module after the iterator and have it send all the details the mail service needs, such as the recipient's email address and any other required data.
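Outside of Make's HTTP module, the call in step b) amounts to a plain JSON POST per bundle. A minimal sketch, where the webhook URL and the payload field names are placeholders for whatever your mail scenario expects:

```python
import json
import urllib.request

# Placeholder webhook URL copied from the new mail scenario (step a).
WEBHOOK_URL = "https://hook.make.com/your-webhook-id"

# One payload per iterator bundle; these field names are illustrative only.
payload = {
    "email": "participant@example.com",
    "name": "Jane Doe",
    "certificate_url": "https://example.com/certs/jane-doe.pdf",
}

req = urllib.request.Request(
    WEBHOOK_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # not executed here: the URL is a placeholder
```

In the Make HTTP module you would map the same fields from the iterator output into the request body instead of hard-coding them.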
Once these are set up, your original scenario will no longer approach the maximum execution time, since it only fires quick HTTP calls. To respect the mail rate limit, enable sequential processing in the webhook scenario: it will then process a single webhook at a time and won't pick up the next one until the previous one has completed. This way you can also lower the Sleep module time.
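Conceptually, sequential processing plus the remaining Sleep behaves like a single worker draining a queue with a fixed gap between sends. A rough sketch of that pattern, where `send_mail` is a stand-in for the webhook-triggered mail scenario:

```python
import time
from collections import deque

sent = []  # record of completed sends, for illustration

def send_mail(recipient: str) -> None:
    # Stand-in for the webhook-triggered mail scenario (hypothetical helper).
    sent.append(recipient)

# Each queued item corresponds to one incoming webhook from the main scenario.
queue = deque(["a@example.com", "b@example.com", "c@example.com"])

SECONDS_BETWEEN_MAILS = 6  # 10 mails/minute limit -> 6 s minimum spacing

while queue:
    send_mail(queue.popleft())             # one webhook processed at a time
    if queue:
        time.sleep(SECONDS_BETWEEN_MAILS)  # enforced gap between sends
```

The key point is that the 40-minute limit now applies per webhook execution, not to the whole batch of 500, so the queue can take as long as it needs.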
Like I said, I am not sure what your overall scenarios look like; we could possibly utilize a data store for this as well. But just review this and see if it fits what you are looking for.