Optimize Knowledge File Usage in Make AI Agent (Avoid Reprocessing on Every Execution)

:bullseye: What is your goal?

Uploaded knowledge files should:

  • Be stored persistently (cached or indexed once)
  • Be reused across multiple executions without reprocessing

AI Agent should:

  • Reference stored knowledge instead of re-sending full content each time
  • Reduce redundant token usage
  • Improve performance and cost efficiency

:thinking: What is the problem?

Knowledge file is attached to the AI Agent.

Every time the scenario runs:

  • The file is reloaded and reprocessed
  • Tokens are consumed again for the same static content
  • No persistent memory or reuse mechanism is observed

:test_tube: What have you tried so far?

In Claude AI Projects, you can:

  • Upload multiple files (PDF, text, code, etc.)
  • Store them in a project knowledge base
  • Reuse them across all chats without re-uploading

:backhand_index_pointing_right: According to official docs:

  • Files added to project knowledge are “used across all chats within that project”
  • Claude automatically applies those files as context in every conversation

Hey @ConceptsandBeyond_Ma !

Knowledge files are indexed once and stored, so they are not fully reprocessed on every run.

What happens each time is retrieval. The agent pulls relevant chunks from that stored knowledge and sends them to the model for that execution. That still uses tokens, but only for the retrieved parts, not the entire file.

So you get reuse of the indexed content, but not zero token usage per run.
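To make the distinction concrete, here is a minimal sketch of the pattern described above: chunking and indexing happen once at upload, while each execution only retrieves and sends the most relevant chunks. All names and the word-overlap scoring are hypothetical simplifications (real systems use embeddings); this is not Make's actual implementation.

```python
# Hypothetical illustration of index-once, retrieve-per-run.
# Function names and scoring are made up for this sketch.

def chunk_text(text, size=50):
    """Split a document into fixed-size word chunks (done ONCE, at upload time)."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def score(query, chunk):
    """Toy relevance score: count of shared words (real systems use embeddings)."""
    return len(set(query.lower().split()) & set(chunk.lower().split()))

def retrieve(query, chunks, top_k=2):
    """Done PER EXECUTION: pick only the most relevant chunks to send to the model.
    Tokens are spent on these chunks, not on the whole file."""
    ranked = sorted(chunks, key=lambda ch: score(query, ch), reverse=True)
    return ranked[:top_k]

# One-time indexing of the knowledge file:
doc = "Refund policy: refunds are issued within 30 days. Shipping policy: orders ship in 2 days."
chunks = chunk_text(doc, size=6)

# Each scenario run retrieves only what the current query needs:
context = retrieve("what is the refund policy", chunks, top_k=1)
```

So per-run token cost scales with the size of `context`, not with the size of the uploaded file.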

Ethan Marcellus- Automation Expert at Tuesday Wizard | Top Make Solution Partner | Make Community Contributor

Hey there,

I don’t think you’ve understood what happens when the Make agent runs.

You absolutely can upload multiple files and reuse them across runs without re-uploading them.

What you saw in the scenario is the agent querying the stored file to pull the needed chunks of knowledge from it. This is exactly what Claude does as well; they just don’t show it to you.