Using locally hosted LLMs in scenarios

Dear make.com friends,

I am currently looking for a way to use a locally running LLM in one of my scenarios.
The OpenAI/Anthropic APIs are not an option due to GDPR compliance, and my client wants to host an LLM locally anyway, so I am trying to connect my scenario to it.

Does anyone have experience with this topic and can help me? I would really appreciate it!
To test things before showing them to my client, I am experimenting with LM Studio and Llama 3.2 3B to get a feel for it, but without success so far.
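For reference, here is a minimal sketch of what I am trying, assuming LM Studio's local server is running on its default port (1234); it exposes an OpenAI-compatible API, so the OpenAI Python SDK can point at it. The model id below is an assumption based on what LM Studio typically reports via /v1/models and may differ in your setup:

```python
# Minimal sketch: calling LM Studio's local OpenAI-compatible server.
# Assumes the server is started in LM Studio with Llama 3.2 3B loaded.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's local server
    api_key="lm-studio",  # any non-empty string; LM Studio ignores it
)

response = client.chat.completions.create(
    model="llama-3.2-3b-instruct",  # assumed model id; check GET /v1/models
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```

One caveat: since Make.com scenarios run in the cloud, the HTTP module cannot reach localhost directly; the local server would need to be exposed publicly, e.g. via a tunnel.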

Update: solved my problem by using the Langdock API, which is GDPR-friendly and gives me access to a couple of LLMs there :)
