Hi everyone,
I still class myself as a no-code noob here, so bear with me.
I’ve created a scenario where we take data from an API - it’s keyword ranking data.
So things like account, keyword, rank, date etc.
We also look up the account name in a Make Data store to get an ID, then apply that as a variable.
Finally, I want to store this within a Snowflake database.
I’ve created the scenario and it’s working, but each account has around 600 keywords, so between the iterating and the data store lookups I’m using around 2,400 operations per run.
What’s worse is that I’ve so far limited this to 2 accounts, and I’ve got another 350 to do!!
I’d like to try and reduce my number of operations per run, but I’m not sure if/how to condense them.
If I can supply any more information, let me know.
You can definitely skip the Set multiple variables module and do the same directly in the next module.
And I’m not 100% sure how Snowflake works, but maybe you can use a Text Aggregator module to build the JSON first and then send only 1 upload request.
So it will be Iterator → Get a record → Text Aggregator (with the Iterator as the source module) → Execute SQL
and this way only the Get a record module will use a large number of operations; the rest will only use 1 each.
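To make that idea concrete, here’s a rough sketch (in Python, outside of Make) of what the Text Aggregator feeding a single Execute SQL module would effectively produce: one multi-row INSERT instead of 600 separate ones. The table and column names here are just placeholders, not anything from your scenario.

```python
# Sketch: collapse all iterated rows into one multi-row INSERT statement,
# the way a Text Aggregator + single Execute SQL call would.
# Table/column names are placeholders; adjust to your Snowflake schema.
# A real implementation should use parameter binding rather than string
# formatting, to avoid SQL injection.

def build_bulk_insert(rows):
    """rows: list of dicts with keys account_id, keyword, rank, date."""
    values = ",\n".join(
        "('{account_id}', '{keyword}', {rank}, '{date}')".format(**row)
        for row in rows
    )
    return (
        "INSERT INTO keyword_ranks (account_id, keyword, rank, date)\n"
        "VALUES\n" + values + ";"
    )

rows = [
    {"account_id": "A1", "keyword": "shoes", "rank": 3, "date": "2024-01-01"},
    {"account_id": "A1", "keyword": "boots", "rank": 7, "date": "2024-01-01"},
]
print(build_bulk_insert(rows))
```

So even with 600 keywords per account, the Snowflake side of the scenario costs a single operation per run.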
And about the Get a record module - is there maybe some logic behind how the IDs are generated? Or are they just static IDs with no way to escape the searching?