Hi everyone,
I’m building a scenario to analyze large HTML pages for CRM or ERP technology references using ChatGPT. The idea is to split the HTML into 10,000-character chunks (to stay within token limits), process each chunk individually with the OpenAI GPT module, then aggregate the results into a single summary.
Current setup:
1. HTTP module retrieves the raw HTML from a website (1.body).
2. Set Variable module attempts to split the HTML into 10,000-character chunks using splitText(toString(1.body); 10000), with the result stored in an output variable named Chunks.
3. Iterator module loops through the Chunks array.
4. Each chunk is sent to the OpenAI GPT module.
5. An Array Aggregator collects the GPT results.
6. Optionally, a final GPT call summarises the collected results before storing them in Google Sheets.
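For reference, here is the chunking behaviour I'm expecting from step 2, sketched in Python purely as an illustration (the chunk_text helper is my own naming, not a Make function):

```python
def chunk_text(text: str, size: int = 10_000) -> list[str]:
    """Split text into consecutive fixed-size chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

html = "x" * 500_000  # stand-in for the real page body
chunks = chunk_text(html)
print(len(chunks))  # → 50 chunks for a 500,000-character page
```

So for my ~500,000-character page I would expect the Iterator to receive roughly 50 chunks, each safely under the model's context limit.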
The problem: even though the HTML is over 500,000 characters long, splitText() returns only a single chunk. The Iterator therefore runs just once, and I eventually hit OpenAI's token limit (~16k tokens) on the final GPT module.
Can anyone advise on what I am doing wrong?
Thanks,
Darren
(P.S. I am still new to Make.com.)