JSON Data Extraction Issue – How to Loop Through Nested Arrays?

:wave: Hi everyone,

I’ve been stuck on an issue for 3 hours :rage: trying to properly extract all the values from a nested JSON object.

:pushpin: My automation goal:

• I’m scraping a website to collect data in JSON format.

• One of the key pieces of data is popularTimesHistogram, which contains occupancy rates (occupancyPercent) per hour (hour) for each day of the week (Su, Mo, Tu, etc.).

:mag: Simplified JSON example:

"popularTimesHistogram": {
  "Su": [
    { "hour": 4, "occupancyPercent": 10 },
    { "hour": 5, "occupancyPercent": 7 },
    { "hour": 6, "occupancyPercent": 8 }
  ],
  "Mo": [
    { "hour": 4, "occupancyPercent": 17 },
    { "hour": 5, "occupancyPercent": 27 }
  ]
}

:arrow_right: Goal: Extract all values and display them in this format:

Su: 4h: 10% | 5h: 7% | 6h: 8%
Mo: 4h: 17% | 5h: 27%
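(Outside of Make, the transformation I’m after would look roughly like this Python sketch, using the same field names as my JSON above, just to make the target logic explicit:)

popular_times = {
    "Su": [
        {"hour": 4, "occupancyPercent": 10},
        {"hour": 5, "occupancyPercent": 7},
        {"hour": 6, "occupancyPercent": 8},
    ],
    "Mo": [
        {"hour": 4, "occupancyPercent": 17},
        {"hour": 5, "occupancyPercent": 27},
    ],
}

# One line per day: pair every hour with its occupancy and join with " | ".
for day, entries in popular_times.items():
    line = " | ".join(f"{e['hour']}h: {e['occupancyPercent']}%" for e in entries)
    print(f"{day}: {line}")

# Prints:
# Su: 4h: 10% | 5h: 7% | 6h: 8%
# Mo: 4h: 17% | 5h: 27%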

:hammer_and_wrench: What I’ve tried so far:

:one: Simple request to retrieve each value

{{3.popularTimesHistogram.Su[].hour}}{{3.popularTimesHistogram.Su[].occupancyPercent}}

:x: Issue: It only returns the last value of the list.

:two: Using map() to loop through all values

{{join(map(3.popularTimesHistogram.Su; "hour"; "occupancyPercent"); " | ")}}

:x: Issue: It only returns the hours, but not the occupancy rates.

:three: Trying to concatenate values correctly

{{join(map(3.popularTimesHistogram.Su; "hour"); "h : ") & " - " & join(map(3.popularTimesHistogram.Su; "occupancyPercent"); "% | ")}}

:x: Issue: It only returns the occupancy rates, but the hours disappear (see the sketch after this list for why the pairing gets lost).

:four: Using modules like Array Aggregator and Parse JSON, converting to CSV, asking ChatGPT, and asking Make AI

:x: Issue: Some methods remove data or don’t allow me to reconstruct the structure.
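My guess at why attempts 2 and 3 fall short (happy to be corrected): map() extracts a single field at a time, so joining the hours and the percentages separately loses the element-by-element pairing. In plain Python terms, what I seem to be getting versus what I need (illustration only, not Make syntax):

su = [
    {"hour": 4, "occupancyPercent": 10},
    {"hour": 5, "occupancyPercent": 7},
    {"hour": 6, "occupancyPercent": 8},
]

# What attempts 2 and 3 effectively produce: two independent joins,
# so the hour/percent pairing is gone.
hours_only = " | ".join(str(e["hour"]) for e in su)                 # "4 | 5 | 6"
percents_only = " | ".join(str(e["occupancyPercent"]) for e in su)  # "10 | 7 | 8"

# Rebuilding the pairing means combining per element (a zip),
# rather than concatenating two flat lists.
paired = [f"{h}h: {p}%" for h, p in zip(
    (e["hour"] for e in su), (e["occupancyPercent"] for e in su))]
# -> ["4h: 10%", "5h: 7%", "6h: 8%"]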

:bulb: My final goal:

I want to send this data to ChatGPT, so it can analyze multiple gyms based on their occupancy rates and recommend the best gym depending on the current time.

:arrow_right: Since each gym’s data is sent as separate bundles, I’ll likely store the results in Google Sheets first before sending them all together to ChatGPT.

:question: My question:

How can I properly extract all values from popularTimesHistogram (days, hours, and occupancy percentages) in a single request?

Thanks in advance for your help! :blush: :pray:

Hey Theo,

You can use an iterator over the array followed by a text aggregator to set up the data in the new format. Something like this, for example:
blueprint.json (11.1 KB)
I placed two iterator-aggregator combos because in your example the days come in a collection, not an array. If the days were in an array themselves, you would need one iterator to go through each day and a second iterator to go through the hours, then one text aggregator to build the hours for each day and a second aggregator to build the week.
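In plain-code terms, that setup amounts to a nested loop: the inner pass joins one day's hours, the outer pass joins the days. A rough Python sketch of the logic (not Make syntax, and assuming the days arrive as a list of day/hours items):

weekdays = [
    {"day": "Su", "hours": [{"hour": 4, "occupancyPercent": 10},
                            {"hour": 5, "occupancyPercent": 7}]},
    {"day": "Mo", "hours": [{"hour": 4, "occupancyPercent": 17}]},
]

week_lines = []
for day in weekdays:                       # outer iterator: one bundle per day
    hour_parts = []
    for h in day["hours"]:                 # inner iterator: one bundle per hour
        hour_parts.append(f"{h['hour']}h: {h['occupancyPercent']}%")
    day_text = " | ".join(hour_parts)      # first text aggregator: one day's line
    week_lines.append(f"{day['day']}: {day_text}")

week_text = "\n".join(week_lines)          # second text aggregator: the whole week
print(week_text)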

But if you are sending this data to ChatGPT for analysis, why not send it directly? Why do you need to transform it?

Hey Stoyan_Vatov,

Thank you so much for your reply and the example!

Here’s what I did:

First, I imported the blueprint alone and tested it—everything worked fine.

Then, I copied and pasted each module into my automation.

I modified the iterators to match my JSON output from Apify.

When I ran the automation, I got this output:

Su: 4h: 4% | 4h: 4% | 4h: 4% | 4h: 4% |

So, I asked ChatGPT to analyze what was happening between Module 1 (JSON parser) and Module 3 (iterator). After comparing the data, I noticed a slight difference—the key “popularTimesHistogram” was missing in the provided model. However, the real issue seems to be between the iterator and the text aggregator because the iterator outputs:

[
    {
        "hour": 4,
        "occupancyPercent": 4,
        "__IMTINDEX__": 1,
        "__IMTLENGTH__": 24
    },
    {
        "hour": 5,
        "occupancyPercent": 5,
        "__IMTINDEX__": 2,
        "__IMTLENGTH__": 24
    }
]

While the aggregator outputs:

"text": "4h: 4% | 4h: 4% | 4h: 4% |

I tried modifying the text in the aggregator from {{3.popularTimesHistogram.Su.hour}} to {{3.hour}}, but now I get nothing at all.

I can’t figure out where I went wrong.

As for your question—

The full output from the Google scraper extraction is 10,890 characters, which means 76,230 characters for 7 locations. That would result in huge requests for ChatGPT to process.

Edit: However, I just tested it, and it comes out to 5,763 tokens per location, with a limit of 128,000 on GPT-4o. So I’ll give it a try later. I just hope it won’t get lost in all that text—because I’ve tried sending files via vector stores before, and it was a nightmare. :sweat_smile:

Edit 2:

Actually, I can’t do it this way. If I want to send the data formatted as in the picture below, I end up with the same issue as before, where I only get the first row for each day.

How should I send it instead?

Can you share the full input bundle you are using? Or the blueprint of the scenario?

It looks like it's not iterating over the array but hard-passing the first item from it.

For sure @Stoyan_Vatov, thank you again for your help!

Theo59310_Blueprint.json (33.2 KB)

In the text aggregator modules you should be mapping the data coming from the iterators, not from the array itself. This is why you were getting the same values repeated.
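In loop terms: inside the loop you reference the loop variable (the iterator's current bundle), not the original collection; referencing the source array re-reads the same first element on every cycle, which matches the repeated "4h: 4%" output above. A small Python sketch of the difference (illustration only, not literal Make mappings):

su = [{"hour": 4, "occupancyPercent": 4}, {"hour": 5, "occupancyPercent": 5}]

# Mapping the parsed-JSON array path inside the aggregator behaves roughly
# like this: the same (first) element is read on every cycle.
wrong = " | ".join(f"{su[0]['hour']}h: {su[0]['occupancyPercent']}%" for _ in su)
# -> "4h: 4% | 4h: 4%"

# Mapping the iterator's own output fields behaves like this: each cycle
# uses its current item.
right = " | ".join(f"{item['hour']}h: {item['occupancyPercent']}%" for item in su)
# -> "4h: 4% | 5h: 5%"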


I just wanted to take a moment to sincerely thank you for your response. After struggling with this issue for hours, your help has been truly invaluable! :pray:

It’s always great to come across people willing to share their knowledge and lend a helping hand.

Thanks again for your generosity and support! :blush::rocket:

Wishing you an amazing day! :sunny:
