Hello everyone,
I’m currently working on an automation scenario in Make to translate a large French product spreadsheet (30,000 products with 6 columns) using the ChatGPT module together with the Excel module. However, I’ve run into a significant issue:
Running the scenario on the entire spreadsheet would require roughly 180,000 OpenAI API calls (30,000 rows × 6 columns), which is a considerable amount. Processing that volume of calls exceeds the scenario runtime limit, so the scenario times out after 40 minutes.
Here’s a summary of the challenges:
- API Call Volume: Approximately 180,000 API calls are needed to translate the entire spreadsheet.
- Scenario Runtime Limits: The process takes too long, resulting in a timeout after 40 minutes.
- Manual Intervention: To process the entire file, significant manual intervention is required, which is not feasible for our needs.
I am looking for advice or solutions on how to handle this. Specifically:
• Are there any best practices or techniques within Make to manage such large volumes of API calls without hitting the runtime limits?
• Would breaking down the process into smaller chunks be effective, and if so, how would you recommend structuring this?
• Alternatively, if anyone has experience implementing a similar process with a Python script, I would greatly appreciate guidance on how to set it up. Do you think a script would be the better approach in this case?
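To frame the chunking question concretely, here is the kind of batching I had in mind for the script route. This is only a rough sketch: `chunked`, `build_batch_prompt`, and the batch size of 50 are placeholder names and values I picked myself, and the actual OpenAI API call is left out. The idea is that sending, say, 50 cells per request would cut the 180,000 calls down to 3,600:

```python
from itertools import islice

def chunked(iterable, size):
    """Yield successive lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

def build_batch_prompt(cells):
    """Pack many cells into one prompt, numbered so the translated
    lines in the response can be matched back to the source cells."""
    numbered = "\n".join(f"{i + 1}. {text}" for i, text in enumerate(cells))
    return (
        "Translate the following French product texts to English. "
        "Return exactly one numbered line per input, in the same order:\n"
        + numbered
    )

# 180,000 cells in batches of 50 -> 3,600 requests instead of 180,000.
# Each prompt from build_batch_prompt(batch) would then be sent in a
# single chat-completion call (omitted here).
```

Would something along these lines be reasonable, or is there a smarter way to structure the batches (e.g. one row of 6 columns per call to keep context together)?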
Additionally, we anticipate facing the same hard-limit problem when scraping large websites. Any insights or suggestions on handling these scenarios would be highly beneficial.
Thank you in advance for your help!