How to avoid 1 credit per item when sending Apify dataset (200 items) to Airtable in Make?

Hi everyone,
I’m running a scenario where Apify scrapes TikTok data and returns around 200 items in one dataset run. Each item contains multiple fields (caption, video link, metrics, etc.).

My workflow looks like this:

  1. Apify → Get Dataset Items (returns ~200 bundles, each bundle containing multiple fields like id, text, name, etc.)

  2. Airtable → Bulk Upsert Records (advanced)

The problem is:
If I send the dataset directly into Airtable, Make treats each item separately and charges 1 credit per record → so 200 records = 200 credits per run, which is too expensive.

Has anyone successfully handled bulk inserts to Airtable from Apify via Make without consuming one credit per item?

Any recommended module setup, aggregation method, or best practice would be greatly appreciated :raising_hands:

Hello,

Welcome to the community.

The problem you have is caused by the way Apify returns your data: it returns separate bundles. Make processes each bundle one by one, and each bundle a module processes costs one operation. That's the core logic of Make.com (Operations - Help Center).

You must aggregate them, i.e. merge them all into one bundle.

But there is a tricky part: Airtable's API accepts at most 10 records per request.

Because of that, you must split the data into "packs" of at most 10 bundles each.
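To make the "packs" idea concrete, here is a small sketch of the splitting logic that the aggregator setup below reproduces inside Make. The `chunk()` helper and the sample data are hypothetical, purely for illustration; Make does this with an Array Aggregator, not with code:

```python
def chunk(items, size=10):
    """Split a list into consecutive packs of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# Placeholder stand-in for the ~200 Apify dataset items.
dataset = [{"id": n, "text": f"caption {n}"} for n in range(200)]

packs = chunk(dataset, 10)
print(len(packs))     # 20 packs of 10 items each
print(len(packs[0]))  # 10
```

Each pack then becomes one Airtable Bulk request, so 200 items need 20 requests instead of 200 separate record writes.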

One possible solution:

Steps to be taken:
0. Adjust the scenario to use the "Upsert" module (I used "Create" below to keep things simple).

  1. Connect the last aggregator to the Airtable Bulk module's Records field.
  2. Set the aggregator's "Target structure type" to the Airtable module (the order matters: first add the Airtable module, then configure the Aggregator).
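For context on why the 10-bundle packs are needed: under the hood, the Airtable Bulk module ends up sending a request shaped roughly like the one Airtable's public REST API documents for upserts. A sketch of that payload (the merge field `id` and the sample data are placeholders, not your actual schema):

```python
import json

# Hypothetical pack of 10 aggregated bundles (placeholder data).
pack = [{"id": str(n), "text": f"caption {n}"} for n in range(10)]

# Airtable's upsert endpoint (PATCH /v0/{baseId}/{tableIdOrName}) accepts
# at most 10 records per request, hence the packs of 10.
payload = {
    "performUpsert": {"fieldsToMergeOn": ["id"]},  # field used to match existing rows
    "records": [{"fields": item} for item in pack],
}

print(len(payload["records"]))  # 10
print(json.dumps(payload["records"][0]))
```

You don't build this payload yourself in Make; the Bulk Upsert module does it, which is why the aggregator's target structure has to match the Airtable module.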