Hi there,
I’m newish to the no code automation world and could use some recommendations. I have the beginnings of an Airtable DB that will house Proposals, Purchase Orders, and Invoices. All proposals and Invoices will be in Shared Google Drives.
TL;DR: What’s the best approach to recursively watch all subdirectories and extract folder names (Client/Year/Type) from the file path? This seems like a common use case, but I can’t find a template.
**Goal:** Read all files from Google Drive subdirectories and populate Airtable with metadata.
Drive Structure: Shared Drive “Proposals” → Year (2025) → Type (Commissioning/Design) → Client folders
Airtable Fields: Proposal ID, Name, Client, Year, Type, File URL, Drive File ID, Extension, Create/Modified dates
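To make the mapping concrete, here is a minimal Python sketch (not Make itself, just an illustration) of how the Client/Year/Type fields could be parsed out of a file path, assuming the layout `SharedDrive/Year/Type/Client/filename` described above; the sample path and function name are my own invention:

```python
from pathlib import PurePosixPath

def extract_metadata(path: str) -> dict:
    """Split a Drive-style path into the Airtable fields.

    Assumes the layout: <SharedDrive>/<Year>/<Type>/<Client>/<filename>.
    """
    drive, year, kind, client, filename = PurePosixPath(path).parts
    return {
        "Name": PurePosixPath(filename).stem,
        "Client": client,
        "Year": year,
        "Type": kind,
        "Extension": PurePosixPath(filename).suffix.lstrip("."),
    }

# Hypothetical example path following the structure above
meta = extract_metadata("Proposals/2025/Commissioning/AcmeCorp/hvac-study.pdf")
print(meta)
```

In Make you would build the same mapping with text/array functions over the folder path, but the field-per-path-segment idea is the same.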
Current Issues:
- “Watch Files/Folders” requires drilling down to specific paths like /2025/Commissioning; it can’t watch the root
- “Watch All Files” gives a “needs trigger” error despite having a Modified date trigger
- Missing Client/Year/Type data (I can only get hardcoded values)
- ChatGPT suggested Search Files/Folders with the query id = "{{1.parents}}", but I’m getting errors
Has anyone solved similar Google Drive → Airtable scenarios in Make? Or can anyone point me to a template?
Hey @Carie_S,
you’ve already discovered that fetching the full folder structure isn’t the easiest task.
May I ask whether you need a one-time sync into Airtable, after which you would create folders through Make (which makes it much easier to keep the folder structure up to date), OR whether you/your team manually create the folders in Google Drive, so you need to sync on an ongoing basis?
Airtable does offer a direct Airtable ↔ Google Drive connection; it’s at least worth a look to see whether it would help you.
Depending on your answer to the question above, I’m happy to point you a little further in the right direction.
Best,
Richard
Founder of Simplified Webhooks - NoCode Webhooks for Airtable
Hey Richard!
Thanks for the response. Yes, I need an initial sync to log all the files currently there, then once we have all legacy data accounted for, updates would be triggered when a new file is dropped.
I have been working with the Airtable/Drive connection, and will keep trying to iterate on that.
I am really shocked that something that seems so straightforward is proving so difficult.
Thanks,
Carie
Also, I’m thinking the trigger doesn’t need to be real-time; it will be easier if I just set up the scenario to run on a schedule. It’s not like we are dropping proposals every day.
I see! That means you basically need both: an initial sync and then a continuous solution.
For the continuous solution you can probably use the “Watch All Files” module, which triggers on a schedule and tells you when a file has been created or updated. Then you need a recursive solution, which you can build in Make by using the generic webhook module as the scenario trigger: if the created file has a parent, you trigger the scenario to check whether that parent folder already exists in your Airtable base. If that parent folder itself has a parent, you call the same scenario again with the parent folder ID, making it recursive.
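The upward walk described above (file → parent → parent’s parent → root) can be sketched in plain Python; the parent map below is a toy stand-in with made-up IDs, where in Make each hop would be the scenario re-invoking itself via webhook with the parent folder’s ID:

```python
# Toy parent map standing in for Drive's parent lookups.
# All IDs are hypothetical illustrations of the folder structure.
PARENTS = {
    "hvac-study.pdf": "AcmeCorp",
    "AcmeCorp": "Commissioning",
    "Commissioning": "2025",
    "2025": "Proposals",  # "Proposals" is the shared-drive root
}

def ancestors(item: str) -> list:
    """Follow the parent chain until the root; returns a root-first list."""
    chain = []
    while item in PARENTS:          # stop when the item has no recorded parent
        item = PARENTS[item]
        chain.append(item)
    return chain[::-1]              # root first, so it reads like a path

result = ancestors("hvac-study.pdf")
print(result)
```

The root-first list is exactly the Year/Type/Client information you’d then check against (or insert into) the Airtable base.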
For the initial sync you would probably go the other way: start at the top level and, for each item, check whether there are more files and folders inside it. If an item is a folder, you start the same list-files-and-folders scenario again, using the webhook & HTTP module combination.
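The downward, initial-sync recursion can be sketched the same way; the in-memory tree below is a toy substitute for Drive’s folder listing (names are made up), and each recursive call corresponds to the Make scenario invoking itself for a child folder:

```python
# Toy stand-in for Google Drive: each folder name maps to its children.
# In Make, each recursive call would be the scenario re-invoking itself
# via webhook + HTTP with the child folder's ID.
TREE = {
    "Proposals": ["2025"],
    "2025": ["Commissioning", "Design"],
    "Commissioning": ["AcmeCorp"],
    "Design": [],
    "AcmeCorp": ["hvac-study.pdf"],  # leaf entries are files
}

def walk(folder, path=()):
    """Recursively yield (full_path, file_name) for every file under folder."""
    for child in TREE.get(folder, []):
        if child in TREE:                       # child is a folder: recurse
            yield from walk(child, path + (folder,))
        else:                                   # child is a file: emit it
            yield "/".join(path + (folder,)) + "/" + child, child

files = list(walk("Proposals"))
print(files)
```

Each yielded full path already contains the Year/Type/Client segments, so one Airtable “create record” step per file at that point would complete the initial sync.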
You might find this thread helpful: Recursive operation - "While" function - Copy Google drive directory
It even contains links to more proposed solutions! 