I started using DeepSeek for some of my scenarios and I was curious about this “Tools” thing that is in the advanced settings. Does anyone know how to use it? I didn’t find a lot of info about it.
It’s kind of a placeholder for future functionality. The idea is that you can provide a list of “tools” for DeepSeek to use when answering your prompt.
In theory, those tools could be API calls to things like Make scenarios (using Custom Webhooks) to perform actions like looking something up in your CRM, or checking calendar availability, or sending an email.
But at present DeepSeek only allows you to specify calls to Python functions. That’s all well and good if you’re hosting the DeepSeek model on your own VM and calling it from Python. But it’s a redundant concept when you’re using the publicly hosted DeepSeek service.
When they extend it to cover API calls, there’ll be some cool use cases.
Thank you for the reply! Definitely, it would allow a lot of cool stuff!
This was bothering me, so I did a little more digging. Turns out I was on the wrong track.
DeepSeek’s API (like that of many LLM services) simply mirrors the OpenAI API.
OpenAI’s developer docs have a lot more on using functions.
Essentially, the model indicates in the completion message object that it wants to call a function. It’s entirely up to you how you handle that request - you could have a Make Router and different paths for each function. You then return the results of the function with specific parameters in a message back to the model.
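To make that concrete, here’s a minimal sketch of the round trip against DeepSeek’s OpenAI-compatible endpoint using the standard OpenAI Python SDK. The get_weather function and its schema are just hypothetical stand-ins for whatever your tool actually does (e.g. calling a Make webhook):

```python
# Minimal sketch of the tool-calling round trip (OpenAI-compatible API).
# get_weather and its schema are hypothetical examples.
import json
from openai import OpenAI

client = OpenAI(api_key="YOUR_DEEPSEEK_API_KEY", base_url="https://api.deepseek.com")

# 1. Describe the tool so the model knows it can ask for it.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

def get_weather(city: str) -> str:
    # Stand-in for a real lookup (your API call, Make webhook, etc.).
    return json.dumps({"city": city, "temp_c": 21, "conditions": "sunny"})

messages = [{"role": "user", "content": "What's the weather in Lisbon?"}]

# 2. The model replies with a tool_calls entry instead of a normal answer.
response = client.chat.completions.create(
    model="deepseek-chat", messages=messages, tools=tools
)
assistant_msg = response.choices[0].message
messages.append(assistant_msg)

# 3. Run each requested function and send the result back as a "tool" message.
for call in assistant_msg.tool_calls or []:
    if call.function.name == "get_weather":
        args = json.loads(call.function.arguments)
        messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": get_weather(**args),
        })

# 4. The model now has the function output and can produce the final answer.
final = client.chat.completions.create(
    model="deepseek-chat", messages=messages, tools=tools
)
print(final.choices[0].message.content)
```

In a Make scenario, step 3 is where a Router with one path per function name would go - each path calls the relevant service and passes the result back to the model in a “tool” message.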
OK, that makes sense. Thanks for your interest in this matter - this is super helpful.
Again, thanks a lot!