Has anyone tried integrating LM Studio as a module?

I am new to the world of Make and automation in general, and even newer to AI. However, I am trying to set up a scenario where you can run a prompt through a set of GPTs, output to a Variable and to a GDoc. Then I plan to aggregate all the responses and run them through ChatGPT to have it analyze them and either choose the best one, or maybe take the best parts of each response and merge them into a single doc.

Anyway, I would like to try using LM Studio as one of my GPTs, but it is not in the module list. I am wondering if anyone else has tried this or knows of an LM Studio module that is available?

I appreciate any insight you can share.
Thanks!

Hey @Kevin_Morris

run a prompt through a set of GPTs, output to a Variable and to a GDoc. Then I plan to aggregate all the responses and run them through ChatGPT to have it analyze them and either choose the best one

Yep, you should have success + fun with that.

use LM Studio as one of my GPTs

So the trick here is that LM Studio runs on your machine, not on the public internet, so the us1.make.com servers can't reach it. For LM Studio, Ollama, etc., perhaps try this blog post I googled, which talks about using ngrok to expose the local LLM server's port to the internet as a public URL.

Then you’d use the ngrok URL in a Make.com HTTP module to make the API calls to the local LLM, along the lines of the sketch below.
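
To sanity-check the tunnel before wiring up the scenario, here is a minimal sketch in Python (the HTTP module fields map one-to-one: URL, method, JSON body). It assumes you've started LM Studio's local server, which speaks the OpenAI-compatible chat completions API on port 1234 by default, and that you've run `ngrok http 1234`; the ngrok URL is a placeholder for whatever ngrok prints.

```python
import requests

# Placeholder: replace with the public URL ngrok prints after `ngrok http 1234`.
NGROK_URL = "https://your-tunnel.ngrok-free.app"

# LM Studio's local server exposes an OpenAI-compatible chat completions endpoint.
response = requests.post(
    f"{NGROK_URL}/v1/chat/completions",
    json={
        # LM Studio serves whichever model you've loaded; depending on the
        # version, this field may be ignored or must match the loaded model id.
        "model": "local-model",
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
        "temperature": 0.7,
    },
    timeout=120,  # local models can take a while on laptop hardware
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

If that works from a machine outside your network, the same request pasted into the HTTP module should work too.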

Once you’ve figured all that out, the limitation will be that your Make scenario can only run successfully while your laptop is awake and the ngrok tunnel is running.

Another option for accessing a wide variety of LLM models is openrouter.ai or Hugging Face; both offer inference APIs. Again, use the HTTP module to invoke their APIs, as I don't think there are native modules for them.
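
openrouter.ai is also OpenAI-compatible, so the HTTP module setup is nearly identical. A rough sketch (the API key environment variable and the model id are just examples; browse their model list for real ids):

```python
import os
import requests

# Assumes you've set OPENROUTER_API_KEY in your environment.
response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        # Example model id; openrouter routes to many providers and models.
        "model": "meta-llama/llama-3.1-8b-instruct",
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```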

Thank you very much for the response. After reading this, it seems this is a bit outside of my experience level at this point. I will have to continue to dig into this and figure some things out first. Appreciate you taking the time to respond.