Gsheets - How to copy a range of values onto a new sheet?

Hi all,

I have this Google spreadsheet I use for work, linked below.

I have two sheets in here - one is called “Template” and the other is “Sheet 1 Test”. I am trying to figure out how to take rows 1-31 (the entire rows) and “paste them” onto “Sheet 1 Test”.

The first issue I am running into, as per the screenshot below, is that there doesn’t seem to be an option, or a way, to map an entire row. I had to map the individual values instead. Is there a way to specify the entire rows 1-31 instead of A1-A31?

The second issue I am running into is trying to exactly paste those rows into “Sheet 1 Test”. These are the options I am faced with:

Instead of manually pasting each value, is there a way to just “copy paste” them in a sense?

My third question is - Is there a way to preserve the formatting (ie color, font, etc) when executing this copy/paste (if it is possible in the first place)?

Thanks!


Hello :wave:t5:

I believe that using the standard “Search Rows” module might be a better option for your case.
You would not have to define the specific range (especially the number of columns).
You could simply output all rows with all their columns and then limit the row number(s) in the Make filter e.g. as in the picture below.
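
For illustration, the filter condition could look something like this (a sketch; “Row number” is the Search Rows module’s standard output, and the exact labels may differ in your Make version):

  • Condition: Row number / Numeric operator: Less than or equal to / 31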


Regarding the need to map column values in the “Add a Row” or “Update a Row” module: this, unfortunately, is required, as these modules can’t copy-paste all the data into a new/existing sheet in one go.

If you would not want to limit the data to a specific row range and would simply want to copy-paste the whole sheet, you could use the “Copy a Sheet” module for that.
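
If you would rather do that copy through the raw API (e.g. with the Google Sheets “Make an API Call” module), the underlying endpoint is sheets.copyTo. A minimal sketch, where both IDs are placeholders you would replace with your own spreadsheet/sheet IDs:

  • Endpoint: POST https://sheets.googleapis.com/v4/spreadsheets/{sourceSpreadsheetId}/sheets/{sheetId}:copyTo
  • Body:

{
  "destinationSpreadsheetId": "yourTargetSpreadsheetId"
}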


Cheerio :cat_roomba:


Hi rambizn,

For your second question about pasting rows from one sheet into another, I recently came across the same issue. I found the most efficient solution to be generating a string of comma-separated values with a Text Aggregator, then using the Google Sheets “Make an API Call” module to batch update the sheet.

In my scenario, I am deleting a large range of old data and replacing it with new data from Airtable, aggregated with a Text Aggregator. Using the “Make an API Call” module instead of the Search Rows and Update a Row modules saves over 1,000 operations per run.
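
For context, pasteData treats the delimiter as the column separator and newlines as row separators, so the aggregated text ({{5.text}} in the request below) should come out looking something like this (made-up values):

Alice,42,Green
Bob,17,Blue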

Here is a link to the Google docs on how to structure the request body.

And here is an example of the request body I’m using.

{
  "requests": [
    {
      "deleteRange": {
        "range": {
          "sheetId": 0,
          "startRowIndex": 1,
          "endRowIndex": 2000,
          "startColumnIndex": 0,
          "endColumnIndex": 7
        },
        "shiftDimension": "ROWS"
      }
    },
    {
      "pasteData": {
        "coordinate": {
          "sheetId": 0,
          "rowIndex": 1,
          "columnIndex": 0
        },
        "data": "{{5.text}}",
        "delimiter": ","
      }
    }
  ]
}
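
One thing to note: grid indexes in the Sheets API are zero-based and the end indexes are exclusive, so startRowIndex: 1 / endRowIndex: 2000 clears rows 2 through 2000 while leaving row 1 (e.g. a header) untouched.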

Hope that helps!


Hi,

So… I have the same/similar requirement. I have an HTTP “Make a request” module that I run through an Iterator and then a Text Aggregator, the output of which I then want to put into a Google Sheet (and eventually a Zoho Books bulk upload). I used the Google Sheets “Make an API Call” module with:

  • URL: /spreadsheets/:batchUpdate, with My SpreadsheetId set to my spreadsheet
  • Body: { "requests": [ { "pasteData": { "coordinate": { "sheetId": 687748420, "columnIndex": 0, "rowIndex": 1 }, "data": "{{20.text}}" } } ] }
  • The spreadsheet is linked via OAuth.

But I get “/v4/spreadsheets/:batchUpdate was not found on this server. That’s all we know.” as an error.

I have double-checked the SpreadsheetId via the API documentation example/test, where it shows that everything works… And just to be certain, I installed the Make add-on in my Google Sheets…

Any ideas?

Cheers,
G

Hey there @gaboom and welcome to the community :wave:

I understand that you might be dealing with a similar problem, but this topic is marked as solved, which means that you are not likely to receive any answers here.

Remember, it is always better to start a new topic for a new question and add a link that will redirect users to this one. That way, you will receive an answer from one of our community members. :pray:
