Hi everyone, hoping you can help. I’ve built a scenario that extracts data from BigQuery and then inserts it into Google Sheets via the Add a Row module. This works fine but uses a lot of operations. I saw the Bulk Add Rows module and I’m trying to set that up, but I keep getting the following error message in the Bulk Add Rows module:
The operation failed with an error. Function ‘extractBulkModuleOutput’ finished with error! Cannot read properties of null (reading ‘1’)
Some images of my workflow:
Any ideas on where I’m going wrong?
Thanks for your help.
David
Hi @David_Watters and welcome!
It looks like the input to the Bulk Add Rows is empty, so the module doesn’t know what to do.
Can you share the input/output bundles of your flow? I suspect BigQuery might be sending an empty result, but I’m not sure. If it isn’t, then the iterator or the aggregator is doing it.
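For reference, as far as I understand it, the Bulk Add Rows module expects its Rows field to be an array of arrays (one inner array per spreadsheet row), usually produced by an Array Aggregator. Roughly like the sketch below; the values are placeholders, not your actual BigQuery fields:

```ts
// Sketch of the shape the Bulk Add Rows "Rows" input should have:
// one inner array per spreadsheet row. The values are placeholders,
// not the actual BigQuery fields from this scenario.
const rows: (string | number)[][] = [
  [101, "Alice", "alice@example.com"],
  [102, "Bob", "bob@example.com"],
  [103, "Carol", "carol@example.com"],
];

// If the upstream aggregator emits nothing, the module effectively receives this
// and has nothing to write.
const emptyRows: (string | number)[][] = [];

console.log(rows.length, emptyRows.length); // 3 0
```

If the aggregator hands the module an empty array (or nothing at all), that tends to surface as an odd internal error rather than a clean validation message.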
L
Hey @L_Duperval , thanks for such a quick response and the warm greeting!
Sure thing. I’m not sure if there’s an easier way to share them, but I’ve saved them all as text files and attached them below (note: I added a limit of 3 to the BigQuery query for brevity and am only mapping the first 3 fields for testing):
Appreciate your help!
Aggregator Input.txt (617 Bytes)
Aggregator Output.txt (857 Bytes)
BigQuery Output.txt (84.9 KB)
Iterator Input.txt (39.6 KB)
Iterator Output.txt (28.3 KB)
Sheets Input.txt (1.1 KB)
bundle7.json (12 Bytes)
Update: I’ve been continuing to try to resolve this. Because I was still consistently receiving the error message, I hadn’t actually checked the Google Sheet to see whether the data had been imported. It turns out the data is imported successfully, but the module still errors. I’ve raised a support ticket with Make.com to see if they can investigate and resolve it, unless anyone else has a suggestion?
I haven’t been able to work on this much. I did get it to work on my end, but I did the same thing you did, except I didn’t use BigQuery; I set the variables at the outset. Did you try using the simple Google Sheets Add a Row module to test? If the insert also has a problem when doing the simple add row, then there may be a problem with what you are trying to add.
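If you want to rule out the data itself, another option is to push the same few test values straight at the Sheets API outside of Make. A rough sketch using the Node googleapis client; the spreadsheet ID, range, and credential setup are placeholders, not anything from your scenario:

```ts
import { google } from "googleapis";

// Rough sketch: append a couple of test rows directly via the Sheets API,
// just to confirm the values themselves are insertable. The spreadsheet ID,
// range and auth setup are placeholders, not taken from the Make scenario.
async function appendTestRows(): Promise<void> {
  const auth = new google.auth.GoogleAuth({
    scopes: ["https://www.googleapis.com/auth/spreadsheets"],
  });
  const sheets = google.sheets({ version: "v4", auth });

  await sheets.spreadsheets.values.append({
    spreadsheetId: "YOUR_SPREADSHEET_ID",
    range: "Sheet1!A1",
    valueInputOption: "RAW",
    requestBody: {
      values: [
        [101, "Alice", "alice@example.com"],
        [102, "Bob", "bob@example.com"],
      ],
    },
  });
}

appendTestRows().catch(console.error);
```

If that works but the Make module still errors, the data is probably fine and the problem sits in the module itself.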
L
David_Watters_blueprint2024-07-29.json (103.0 KB)
Interesting, same error (Function ‘extractBulkModuleOutput’ finished with error! Cannot read properties of null (reading ‘1’)) when I try your blueprint and map it to the same Google Sheet that I’m trying to post to. Again, though, if I check the spreadsheet the data has been added to it.
After a lot of test runs I can see that the error is triggered when there is a header in column AA or beyond: columns A-Z run fine with headers, but the moment you add a header in column AA the error is thrown.
Any ideas why this would be? Super weird…
When you create a new Google Sheet, it creates columns A-Z. Any other column needs to be added by hand. It might be related. I tried with 1000 rows and I don’t see that behavior.
L
No, I don’t think that’s it. See example screenshots:
This setup works fine with column headers up to column Z:
This setup (with an extra column header in column AA) throws the error:
It definitely looks like a bug. Possibly in the underlying npm package:
I ran my blueprint and it worked. I set it up to insert 30 columns and it still worked. Then I started messing around with opening and closing the advanced settings and changing the value of the “formatted” option (because the insert was keeping the same format as the header). When I did that, it broke.
It’s possible that running with a configuration option and then disabling it caused the problem.
One way to test this hypothesis is to recreate the scenario manually from scratch (yeah, I know, a pain) and see if the problem appears.
L
That’s interesting, thanks for your investigations. I’ll try recreating it from scratch and see how that goes.
Unfortunately, same error. I created a new workflow from scratch (same settings) and a new Google Sheet to push the data into (Testing V2 - Google Sheets). I manually added column headers to row 1 up to column AA, then ran the automation; it failed with the same error.
If I remove the value in AA1 (the column header), the automation runs successfully. If I add it back in, it breaks.
Hmm, then it does look like a bug. Have you tried testing with different column ranges (A-Z, A-ZZ, etc.) to see if that changes anything?
L
Yeah, no difference unfortunately. My guess is that the configuration setting for the column range defaults to A-Z regardless of what the user has selected. I’ve updated the ticket with Make and hopefully they’ll be able to resolve it soon.
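For what it’s worth, the error message itself fits that theory: in JavaScript, a regular expression that only accepts a single column letter returns null for a reference like AA1, and reading index 1 of that null result throws exactly “Cannot read properties of null (reading ‘1’)”. A purely speculative sketch of that kind of parsing (not Make’s actual code):

```ts
// Purely speculative sketch of the kind of parsing bug that could produce
// "Cannot read properties of null (reading '1')". This is NOT Make's actual code.
function parseCellRef(ref: string): { column: string; row: number } {
  // A pattern that only accepts single-letter columns (A-Z):
  const match = ref.match(/^([A-Z])(\d+)$/);

  // "Z1"  -> ["Z1", "Z", "1"], so everything works.
  // "AA1" -> null, so reading match[1] throws:
  //   TypeError: Cannot read properties of null (reading '1')
  // (The non-null assertions below just sidestep the compiler to mirror
  //  what unchecked JavaScript would do at runtime.)
  return { column: match![1], row: Number(match![2]) };
}

console.log(parseCellRef("Z1"));  // { column: 'Z', row: 1 }
console.log(parseCellRef("AA1")); // throws the same error the module reports
```

Whether that’s actually what the module does under the hood, only Make support can say.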
Check out this thread: Use less operation when exporting to Google Sheet | Bulk add google sheet rows
You can grab a scenario JSON config there and adapt it to your use case.
Thanks Hugo. The workaround I’m using at the moment is an error handler that lets the scenario continue (as the error doesn’t stop the data from being transferred). I’ve got a ticket open with Make support; if it’s not resolved in a few weeks, I’ll look into shifting over to the solution you identified there.