Hi guys, I'm trying to optimize my workflow here. My issue is that a lot of operations are consumed in both chains of the router. My data set can be huge, and the workflow runs every module once per bundle, so my operations stack up quickly. Can you help me optimize this workflow?
Things I have tried: a Bulk Update module at the end, with arrays just before it to process the entire set of bundles. The issue I ran into is that I'm trying to update non-consecutive rows, and Bulk Update doesn't have a way to dynamically distinguish which rows to update. I also tried the 'Make an API Call' module with similar results. Any help would be greatly appreciated.
Hi @Reid_Freeman,
I'm not 100% sure what you're trying to achieve within the workflow itself.
You have a "Watch New Rows" module and then later a "Search Rows" module. Are you updating the new row? If so, you might not need the "Search Rows" module.
Generally it comes down to your business process: what should happen, and when? If you say "every time I add a new row, 23 different rows should be updated individually", then it's going to consume 23 × n operations (where n is the number of modules in the chain).
If your aim is to optimize your workflow and reduce the number of operations consumed per run, the first step is to get rid of any unnecessary modules.
For example, if you want to update an existing row when a new row is added, you don't need the "Search Rows" module.
Hi Richard,
Thank you for your feedback. I should clarify what this workflow does. I have another scenario that pulls in scraped data and dumps it into my spreadsheet. This scenario is just a cleanup tool built around ChatGPT. The top chain converts job descriptions from HTML to Markdown because it is easier to read. The second chain then reads those results and gives me key metrics like salary, YOE, keywords, and qualifications. The Search Rows modules are looking for specific things: the top chain looks for text containing HTML so that it only grabs new posts, while the bottom chain looks for blank cells in the Key Words column to tell ChatGPT which new posts to update.
Overall, my issue is that I usually get 10-30 new posts every day. The Markdown, ChatGPT, JSON, and Update a Row modules each consume 1 operation per post, so in a given day my operations land somewhere between 64 and 154. As I mentioned earlier, I would like to pass the results from Markdown and ChatGPT to an array to reduce the operational load of this scenario; I'm just having issues with the Bulk Update and Make an API Call modules.
The first two modules are just triggers tied to my other scenario. They simply watch for an update to trigger a cleanup of my spreadsheet. These two modules could be removed or simplified, but it's only a two-operation cost, so I'm not concerned with them.
Hi @Reid_Freeman,
thanks for the explanation.
While I don't fully understand everything, it sounds like you will always need the Markdown or ChatGPT + JSON Parse modules for each row, so you will not save operations on those either way. You would only save on the "Update a Row" module.
Theoretically, the Google Sheets API allows updating different rows in one request (API Documentation); however, it's quite complicated to set up within Make. The API expects an array of changes, where you specify each range and the values being written.
It would be easier if you knew that all rows will fall in a contiguous block, e.g. rows 10-40, but if it could be rows 5, 9, 11, 33, 47-50… then it's somewhat complicated to set up.
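To illustrate (the sheet name, rows, and values here are placeholders), you'd POST to `https://sheets.googleapis.com/v4/spreadsheets/{spreadsheetId}/values:batchUpdate` with a body roughly like this:

```json
{
  "valueInputOption": "USER_ENTERED",
  "data": [
    { "range": "Sheet1!B5",      "values": [["value for row 5"]] },
    { "range": "Sheet1!B9",      "values": [["value for row 9"]] },
    { "range": "Sheet1!B47:B50", "values": [["r47"], ["r48"], ["r49"], ["r50"]] }
  ]
}
```

A contiguous block like rows 47-50 can share one range (one inner array per row), while scattered rows like 5, 9, 11 each need their own `data` entry — and that is exactly the part that's fiddly to build dynamically in Make.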
For me it depends on the value you get out of it and whether you want to learn more. It can be fun to dive into the Sheets API and successfully update multiple cells, but for 10-30 operations per day it might not pay off economically right away.
Yeah, I did a deep dive into the Make an API Call module, and the issue I ran into is that if I use an array, I have no way to target specific rows from within the array.
Here is roughly an example of what I want to do.
I would use an array for my results and pass them to the update module. My data is structured like so: columns B, J, K, and L are the desired output locations, and the row number comes from the dynamic Search Rows modules. The columns are labeled Salary Range, YOE, Key Words, and Qualifications (B, J, K, L respectively). I don't know of a specific way to drop a given piece of data X into column Y, row Z, if that makes sense. I did see the post about bulk updates using an API call, but I run into the same issue there: I don't know how to map this data as one string to different columns.
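To make it concrete (the row numbers and values below are made up), this is roughly what I'd want the final request body to produce — each result landing in its own column and row, with B in a separate range because it isn't next to J-L:

```json
{
  "valueInputOption": "USER_ENTERED",
  "data": [
    { "range": "Sheet1!B12",     "values": [["$90k-$120k"]] },
    { "range": "Sheet1!J12:L12", "values": [["3-5 years", "Python, SQL", "BS in CS"]] },
    { "range": "Sheet1!B27",     "values": [["$70k-$85k"]] },
    { "range": "Sheet1!J27:L27", "values": [["2 years", "Excel, VBA", "None listed"]] }
  ]
}
```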
I should mention that I am pretty new to programming and Make in general; this is probably the main reason I am having my issues lol.
That’s really non-trivial to do! 
Here are my 2 cents:
- I think it would help if you had a column in your Google Sheet that only contains the formula "=ROW()". That way your Search Rows module would also output the row number of each matching row.
- Before I'd set up the update module to work dynamically, I'd set it up statically to simply set a value in one or multiple columns to "test". Once you've got that working, you can work your way backwards (i.e. multiple columns, then multiple rows…). At the end you'll use the Array Aggregator or a JSON aggregator to build the final request, but, again, I'd start as simple as possible (see the sketch after this list).
- I'm not 100% sure how the Bulk Update module works under the hood, so I'd probably opt for the "Make an API Call" module to have full control.
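As a rough sketch of that static starting point (the sheet name and cells are placeholders — adjust them to your own layout), the body you'd send to `values:batchUpdate` can be as simple as:

```json
{
  "valueInputOption": "USER_ENTERED",
  "data": [
    { "range": "Sheet1!B2",    "values": [["test"]] },
    { "range": "Sheet1!J2:L2", "values": [["test", "test", "test"]] }
  ]
}
```

Once that writes "test" into the right cells, you'd replace the hard-coded row 2 with the row number from your "=ROW()" column and the "test" strings with the mapped ChatGPT results, building one `data` entry per range via the aggregator.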
I'll take another crack at this today and see if I can make some progress. To your point, is it worth all this work to cut my operations down by 100? Maybe not. I do, however, like to learn and understand how it all works, so it's kinda fun for me to keep optimizing! Thanks for your suggestions!