My goal here is to pick up a regularly emailed CSV file and use it to update the contents of a Google Sheet. I did have a working mechanism that found and updated rows, or added new ones, but the data has grown and it’s burning through operations, so now I want to do a bulk update of the Google Sheet from the CSV in my inbox.
For development, I’m picking up a small sample CSV from Google Drive and using that to run the scenario.
I’m using the CSV Parse operation, feeding that into an Array Aggregator, then into a Create JSON module, and passing the result to a Google Sheets Make an API Call module.
The aim is to create JSON that looks like:
{
  "valueInputOption": "RAW",
  "data": [
    {
      "range": "Zap!A:D",
      "majorDimension": "ROWS",
      "values": [
        ["col1","col2","col3","col4"],
        ["col1","col2","col3","col4"],
        ["col1","col2","col3","col4"]
      ]
    }
  ]
}
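(For reference, that body is the request format for the Sheets API spreadsheets.values.batchUpdate method, i.e. POST https://sheets.googleapis.com/v4/spreadsheets/{spreadsheetId}/values:batchUpdate, which is what the Make an API Call module will be sending.)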
I’m having problems getting the CSV aggregated into the values array as one array per row, with each cell value as an element.
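To show the shape I’m after outside of Make, here’s a rough Python sketch of the equivalent logic (sample.csv is just a stand-in for my sample file):

import csv
import json

# Every parsed CSV row becomes one inner array of cell values.
with open("sample.csv", newline="") as f:  # stand-in for the emailed/Drive CSV
    rows = list(csv.reader(f))

# One single "data" element whose "values" holds ALL of the rows,
# rather than one "data" element per row.
body = {
    "valueInputOption": "RAW",
    "data": [
        {
            "range": "Zap!A:D",
            "majorDimension": "ROWS",
            "values": rows,
        }
    ],
}

print(json.dumps(body, indent=2))

The point being that all of the parsed rows end up inside a single values array in a single data element, and that’s the bit I can’t reproduce with the aggregator.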
I’ve got the Array Aggregator mapped to the data element in the JSON object, and it spits the items into the values attribute:
But then I get multiple data elements in the JSON, one per row, and they all target A:W, so I just keep overwriting the same row.
The JSON object I created looks like this:
And if I leave the Array Aggregator on Custom, then when we hit this point it can’t convert the associative array I get out of the back of it into the text elements; if I use a collection instead, then we have a problem because the data that goes to the Google API isn’t in the right shape:
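(My reading of that is that each row comes out as an object keyed by column name, something like {"col1":"a","col2":"b",...}, where the batchUpdate body needs a plain positional array like ["a","b",...] per row.)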
It feels like I should be doing something like this:
But I can’t work out how to have that array map into the values correctly.
So I’m obviously missing something about how to make this conversion work correctly. Any ideas?