Hi everyone — I’m automating a candidate search and evaluation scenario and I’ve run into a logic issue I can’t seem to fix.
What my scenario does
- Watch for new job requests in Google Sheets.
- Use the HH API to search for and fetch multiple resumes in JSON format.
- Parse the resumes.
- Send each resume to OpenAI GPT for analysis.
- Parse the AI's JSON result and save the evaluation back into Google Sheets.
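The intended per-resume logic can be sketched in Python, with the Make modules simulated as plain functions (`fetch_resumes` stands in for the HH API HTTP module and `analyze_resume` for the OpenAI module — both names and payloads are illustrative, not real APIs):

```python
import json

def fetch_resumes():
    # Stand-in for the HH API HTTP module: returns a JSON string
    # whose "items" key holds an array of resumes (assumed shape).
    return json.dumps({"items": [
        {"id": "r1", "title": "Python Developer"},
        {"id": "r2", "title": "Data Analyst"},
    ]})

def analyze_resume(resume):
    # Stand-in for the OpenAI module: one call per resume, one result per call.
    return {"id": resume["id"], "score": 0.5}

parsed = json.loads(fetch_resumes())                     # Parse JSON module
results = [analyze_resume(r) for r in parsed["items"]]   # Iterator + OpenAI
assert len(results) == len(parsed["items"])              # one evaluation per resume
```

The key property is the last line: the number of OpenAI calls equals the number of resumes in the array.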
The problem
The HH API returns an array of resumes. In my flow, after Parse JSON I use a Basic Feeder (module 5 in my blueprint) to pass the parsed data to OpenAI. But instead of processing each resume separately, OpenAI receives the entire array in a single run and produces only one combined analysis.
What I expected
I expected OpenAI to run once per resume — so if the API returns 10 resumes, OpenAI should be called 10 times, each with just one candidate’s data.
What I tried
- Mapping the parsed JSON directly into OpenAI's user message.
- Feeding the parsed output into Basic Feeder (I thought this would act like an iterator).
- Testing different array paths.
- Removing and re-adding modules.
- Checking the execution log: Parse JSON outputs an array, but the downstream modules only run once.
What I think is happening
The Basic Feeder is sending the whole array as a single bundle. Because I’m not actually using the Iterator module on the resumes array, the scenario never splits into multiple bundles for OpenAI.
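If that diagnosis is right, the difference in bundle counts can be illustrated with a small Python sketch (lists standing in for Make bundles — this is a model of the behavior, not Make's internals):

```python
resumes = [{"id": "r1"}, {"id": "r2"}, {"id": "r3"}]

# Feeder-style pass-through: the whole array travels in ONE bundle,
# so the downstream OpenAI module runs once.
feeder_bundles = [resumes]

# Iterator behavior: the array is split into one bundle PER element,
# so the downstream OpenAI module runs once per resume.
iterator_bundles = [r for r in resumes]

print(len(feeder_bundles), len(iterator_bundles))  # 1 3
```

Downstream modules run once per incoming bundle, which is why the single-bundle case collapses ten resumes into one combined analysis.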
Questions:
- In my case, should I replace the Basic Feeder with an Iterator on the resumes array from module 52 (parse resumes)?
- How exactly should I map the Array field in the Iterator to ensure each resume is passed as its own bundle?
- After the Iterator, can I pass the single-item output directly into OpenAI, or do I need an additional Parse JSON step for each item?
Bonus
If anyone has a working pattern for:
HTTP → Parse JSON → Iterator → OpenAI → Parse JSON → Google Sheets
for processing multiple items individually, I’d love to see your mapping.
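For reference, here is the whole pattern end to end as a Python sketch with the external services mocked (`http_fetch` and `openai_analyze` are hypothetical stand-ins; the `items` key and the AI's JSON reply shape are assumptions):

```python
import json

def http_fetch():
    # HTTP module stand-in: HH API response with an array of resumes.
    return json.dumps({"items": [{"id": "r1", "name": "Alice"},
                                 {"id": "r2", "name": "Bob"}]})

def openai_analyze(resume):
    # OpenAI module stand-in: the prompt asks the model to reply with JSON,
    # so each call returns a JSON string for one candidate.
    return json.dumps({"id": resume["id"], "verdict": "interview"})

sheet_rows = []
parsed = json.loads(http_fetch())        # Parse JSON (the resumes array)
for resume in parsed["items"]:           # Iterator: one bundle per resume
    ai_raw = openai_analyze(resume)      # OpenAI: one call per bundle
    evaluation = json.loads(ai_raw)      # second Parse JSON (the AI's reply)
    sheet_rows.append([evaluation["id"], evaluation["verdict"]])  # Sheets row
```

Each loop iteration corresponds to one bundle flowing through Iterator → OpenAI → Parse JSON → Google Sheets, so the sheet gets one row per resume.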