When I upload 3 files from http://studio.softr.io, the Make.com webhook returns the keys “FIELD__MSRQYLUOL:0”, “FIELD__MSRQYLUOL:1” and “FIELD__MSRQYLUOL:2”. How can I loop through these keys one by one in Make.com and extract all the file links?
The problem is that I never know the exact key names; the only thing I know is that the number increases by 1 every time, so if I upload 5 files the name will be FIELD__MSRQYLUOL:5.
As @samliew said, it becomes tricky when you don’t know the number of keys and the key names are subject to change too.
Another solution could be to use a text parser to extract the URLs from the object. This works if you know that the value is always a URL. You can enable JSON pass-through in the Webhook settings so the scenario receives the raw request body.
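For illustration, with JSON pass-through enabled the raw body might look roughly like the sketch below; the key names follow the pattern from the question, and the URLs are placeholders. Outside of Make.com, looping over such numbered keys in upload order is straightforward:

```python
import json

# Hypothetical raw body as delivered with JSON pass-through enabled;
# key names follow the pattern from the question, URLs are placeholders.
raw_body = """{
  "FIELD__MSRQYLUOL:0": "https://example.com/files/a.pdf",
  "FIELD__MSRQYLUOL:1": "https://example.com/files/b.pdf",
  "FIELD__MSRQYLUOL:2": "https://example.com/files/c.pdf"
}"""

data = json.loads(raw_body)

# Sort the keys by their numeric suffix so the files keep their upload order.
urls = [data[key] for key in sorted(data, key=lambda k: int(k.rsplit(":", 1)[1]))]
print(urls)
```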
Thank you for your great suggestion. However, when I tried to reproduce your solution, it did not work, so I slightly adapted the Text parser setup.
I used “Match pattern” (Regex) instead of “Match elements” with the following pattern:
`https?:\/\/[^\s"]+`
The output now gives me every URL nicely separated.
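To show what that pattern does outside the Text parser module, here is a minimal sketch in Python applying the same regex to a raw pass-through body (the body and URLs are placeholders):

```python
import re

# Same pattern as in the "Match pattern" module: match http(s)://
# followed by anything up to the next whitespace character or double quote.
URL_PATTERN = re.compile(r'https?://[^\s"]+')

raw_body = '{"FIELD__MSRQYLUOL:0": "https://example.com/a.pdf", "FIELD__MSRQYLUOL:1": "https://example.com/b.pdf"}'

urls = URL_PATTERN.findall(raw_body)
print(urls)  # -> ['https://example.com/a.pdf', 'https://example.com/b.pdf']
```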
Hi @Alex15
Here is a lazy method I used after I tried all these solutions and they didn’t work for me.
I had a similar issue where I wanted to iterate through several bundles containing URLs returned from Airtable. Due to the nature of my automation, it is impossible to know the number of fields in advance, and my key names could alternate between two possible values.
My use case was to loop through the URLs to create a new JSON object that would contain the links and a method.
So, what I did was to use the Array aggregator to convert the bundles into a single list.
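As a rough illustration of the transformation (the names and the method value here are assumptions, since the post does not show the exact structures): the aggregator yields a flat list of URLs, and the goal is a JSON object pairing each link with a method.

```python
import json

# Assumed shape of the Array aggregator's output: a flat list of URL strings.
aggregated = [
    "https://example.com/files/a.pdf",
    "https://example.com/files/b.pdf",
]

# Target shape described in the post: each link paired with a method.
# "GET" is a placeholder; the real method depends on the automation.
payload = [{"url": url, "method": "GET"} for url in aggregated]
print(json.dumps(payload, indent=2))
```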
Finally, I passed this JSON string to ChatGPT with a prompt asking it to reshape the data into my desired format. I gave it an example of the outcome and had to give specific instructions; for example, I asked it not to add language tags to the output (it was wrapping the result in Markdown code fences with a json tag) and to return the JSON as is.
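If the model still occasionally wraps its reply in Markdown fences despite the instructions, a small post-processing step can strip them before the JSON is used; a minimal sketch (the function name and behaviour are my own, not from the post):

```python
import json

def parse_llm_json(reply: str):
    """Strip optional ```json ... ``` fences from a model reply, then parse it."""
    text = reply.strip()
    if text.startswith("```"):
        # Drop the opening fence line (with its optional language tag) ...
        text = text.split("\n", 1)[1] if "\n" in text else ""
        # ... and everything from the closing fence onwards.
        text = text.rsplit("```", 1)[0]
    return json.loads(text)

print(parse_llm_json('```json\n[{"url": "https://example.com/a.pdf", "method": "GET"}]\n```'))
```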
The output has been consistent so far and I have no complaints.
One thing I would be mindful of is token limits if your dataset is particularly large; you might want to truncate what you retrieve.