CSV to JSON aggregation for Google Sheets

My goal here is to pick up a regularly emailed CSV file and use it to update the contents of a Google Sheet. I did have a working mechanism that found and updated rows or added new ones, but the data has grown and it's burning operations, so I now want to do a bulk update of a Google Sheet from a CSV in my inbox.

My dev scenario is picking up a small sample CSV from Google Drive and then using that to run the scenario.

I'm using the CSV Parse operation, feeding that into an Array Aggregator, and that into a Create JSON module, before passing the result to a Google Sheets "Make an API Call" module.

The aim is to create JSON that looks like:

{
  "valueInputOption": "RAW",
  "data": [
    {
      "range": "Zap!A:D",
      "majorDimension": "ROWS",
      "values": [
        ["col1","col2","col3","col4"],
        ["col1","col2","col3","col4"],
        ["col1","col2","col3","col4"]
      ]
    }
  ]
}
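For context, this is the request body shape for the Sheets `values:batchUpdate` endpoint. As a sanity check outside Make, a minimal Python sketch of building that shape from a list of rows looks like this (the sheet name "Zap" is just the example above):

```python
# Build the values:batchUpdate payload from a list of rows.
# Each row is itself a list of cell values.
rows = [
    ["col1", "col2", "col3", "col4"],
    ["col1", "col2", "col3", "col4"],
]

payload = {
    "valueInputOption": "RAW",
    "data": [
        {
            "range": "Zap!A:D",       # sheet name and column span from the example
            "majorDimension": "ROWS",  # each inner list is one row
            "values": rows,
        }
    ],
}
```

The whole difficulty in Make is producing that `values` array of arrays from the parsed CSV.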

I'm having problems getting the CSV aggregated into the values array as one array per row, with each cell value inside.

I’ve got the array aggregator mapped to the data element in the JSON object, and spitting the items into the values attribute:

But then I get multiple data elements in the JSON, one per row, and they all target A:W, so I just keep overwriting the same row :frowning:

The JSON object I created looks like this:

And if I leave the Array Aggregator on Custom, then when we hit this point it can't convert the associative array I get out the back of it into the text elements; if I use a collection, then the data that goes to the Google API isn't in the right shape.

It feels like I should be doing something like this:

But I can't work out how to have that array map into the values correctly.

So I'm obviously missing something in how to make this conversion work correctly. Any ideas?

blueprint.json (25.1 KB)

Closest to working blueprint I’ve managed.

Welcome to the Make community!

Could you also provide an example CSV that would translate into the example JSON you provided?


testdata.csv (664 Bytes)

Here's some totally synthetic test data that I have just run through the provided blueprint; it demonstrates the issue.

Thanks,

Hmm, it looks like we can’t import/export data structures along with the blueprints, so I don’t think I can help with this.


I guess the simplest definition of the problem is:

How do I turn a CSV into a JSON array of arrays in make.com?

Everything else is straightforward, but when I model any JSON object, it seems I can only map an Array Aggregator to the more structured aspects of the data.

given this CSV:

Orderno,Orderdate,Company
BSOM1002,2022-12-15,Shop 1
BSOM1002,2022-12-15,Shop 2

How do I get to this?

[
  ["Orderno","Orderdate","Company"],
  ["BSOM1002","2022-12-15","Shop 1"],
  ["BSOM1002","2022-12-15","Shop 2"]
]

In that case it’s very simple. You’ll need a minimum of one module — no Iterators, Aggregators, or Create CSV modules:

Output

(screenshot)

Give it a go and let us know if you have any issues!

samliewrequest private consultation

Join the Make unofficial Discord server!


Module Export

You can copy and paste this module export into your scenario. This will paste the modules shown in my screenshots above.

  1. Copy the JSON code below by clicking the copy button that appears when you mouse over the top-right of the code block.

  2. Enter your scenario editor. Press ESC to close any dialogs. Press CTRL+V (the paste keyboard shortcut on Windows) to paste directly into the canvas.

  3. Click on each imported module and save it for validation. You may be prompted to remap some variables and connections.

View Module Export Code

JSON

{
    "subflows": [
        {
            "flow": [
                {
                    "id": 33,
                    "module": "util:ComposeTransformer",
                    "version": 1,
                    "parameters": {},
                    "mapper": {
                        "value": "Orderno,Orderdate,Company\nBSOM1002,2022-12-15,Shop 1\nBSOM1002,2022-12-15,Shop 2"
                    },
                    "metadata": {
                        "designer": {
                            "x": 14,
                            "y": -733,
                            "name": "CSV"
                        },
                        "restore": {},
                        "expect": [
                            {
                                "name": "value",
                                "type": "text",
                                "label": "Text"
                            }
                        ]
                    }
                },
                {
                    "id": 34,
                    "module": "util:ComposeTransformer",
                    "version": 1,
                    "parameters": {},
                    "mapper": {
                        "value": "[\n  [\"{{join(split(first(split(33.value; newline)); \",\"); \"\"\",\"\"\")}}\"],\n  [\"{{join(slice(split(replace(33.value; \",\"; \"\"\",\"\"\"); newline); 1); emptystring + \"\"\"],\" + newline + \"[\"\"\" + emptystring)}}\"]\n]"
                    },
                    "metadata": {
                        "designer": {
                            "x": 258,
                            "y": -731,
                            "name": "JSON",
                            "messages": [
                                {
                                    "category": "last",
                                    "severity": "warning",
                                    "message": "A transformer should not be the last module in the route."
                                }
                            ]
                        },
                        "restore": {},
                        "expect": [
                            {
                                "name": "value",
                                "type": "text",
                                "label": "Text"
                            }
                        ]
                    }
                }
            ]
        }
    ],
    "metadata": {
        "version": 1
    }
}
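If the nested Make functions in module 34 are hard to read, here is roughly the same string-only transformation rendered in Python. Note that, like the formula, this assumes no field itself contains a comma, quote mark, or newline:

```python
import json

csv_text = (
    "Orderno,Orderdate,Company\n"
    "BSOM1002,2022-12-15,Shop 1\n"
    "BSOM1002,2022-12-15,Shop 2"
)

# Wrap every line in ["..."] and turn field separators into ","
lines = csv_text.split("\n")
rows = ['["' + line.replace(",", '","') + '"]' for line in lines]

# Join the rows into a JSON array-of-arrays as plain text.
json_text = "[\n  " + ",\n  ".join(rows) + "\n]"
print(json_text)

# The result parses as valid JSON.
json.loads(json_text)
```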



Ah that’s interesting and very helpful thanks.

In the real data there are 23 columns, so I'm a bit worried about the maintainability of that set of nested join/split functions if more columns are added in the future.

Though - you have just taught me I can copy paste JSON to/from the scenario editor so I’ve been able to auto-generate some definitions rapidly in a macro environment.

I’ve got this now:

Slightly horrific still, as there is a pair of nested replace() calls to deal with newlines and quote marks in a Text constructor:

That is then composed into some text as valid JSON:

And then parsed against a JSON description:

The JSON I used to create the schema there was:

{
  "values": [
    ["one","two","three"]
  ]
}

Then I can create the JSON:

And finally call the API with that:

Slightly elaborate, but perhaps a bit easier to maintain than the join/slice approach?

You can map module 20's output into module 21's JSON field directly, saving an operation.

1 Like

When I did that, I ended up with an escaped string literal in the field, which didn't work in the final API payload. For some reason, putting the string in directly just didn't work, hence parsing it into a JSON object first and then using that. Not quite what I expected.
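For anyone hitting the same thing, a Python sketch of why this happens: the composed text is already serialised JSON, so mapping it straight into another JSON field serialises it a second time, producing an escaped string rather than a nested array. Parsing it back into a structure first avoids the double encoding:

```python
import json

# The composed text is already a JSON string...
values_text = json.dumps([["one", "two", "three"]])

# ...so embedding it directly in another JSON field escapes it again:
wrong = json.dumps({"values": values_text})
print(wrong)  # {"values": "[[\"one\", \"two\", \"three\"]]"}  <- string, not array

# Parsing it back into a structure first gives the intended shape:
right = json.dumps({"values": json.loads(values_text)})
print(right)  # {"values": [["one", "two", "three"]]}
```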

I needed this, went searching and found nothing satisfying, then discovered a good solution.

I'm webhooking in a CSV file with event registration data and need to convert each row to a separate collection inside a JSON payload to send off to an API for some conversion and reformatting.

In my case, I'm taking the CSV, getting the file with HTTP, then renaming, uploading, and converting it to Google Sheets (the first three modules).

This allows me to work with the file using Google Sheets modules.

From there, I'm using the Search Rows function in Google Sheets, passing the file ID and name (for the sheet name) from the Google Drive upload.

This gives me each of the rows as a separate collection.

From there, aggregate to JSON using whatever custom data structure you require.

In the future, if you add additional columns, the only thing you’ll have to update is your custom data structure in the aggregate to JSON module.
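To illustrate the shape of that last step with a Python sketch (the `registrations` key and output field names here are made up for illustration — your custom data structure will differ): each row arrives as a collection, and the aggregation step maps the collections into the structure the downstream API expects.

```python
# Each Search Rows result is a collection (dict keyed by column header).
rows = [
    {"Orderno": "BSOM1002", "Orderdate": "2022-12-15", "Company": "Shop 1"},
    {"Orderno": "BSOM1002", "Orderdate": "2022-12-15", "Company": "Shop 2"},
]

# The aggregate-to-JSON step reshapes every row into the custom data
# structure; renaming a column only changes this mapping, not the flow.
payload = {
    "registrations": [
        {"order": r["Orderno"], "date": r["Orderdate"], "venue": r["Company"]}
        for r in rows
    ]
}
```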