How to Call Tools with Ollama
According to its official documentation (May 2025), the latest version of Ollama supports tool calling, and the documentation lists the models that currently support tools.

Ollama has kept improving its tool-calling features, and it now also supports the MCP protocol. Tool calling means LLMs are no longer limited to producing text: they can invoke Python (or other) functions, and therefore do anything a programming language can do. The official documentation gives the following example:
from ollama import chat, ChatResponse

# Define the Python function to expose as a tool
def add_two_numbers(a: int, b: int) -> int:
    """
    Add two numbers

    Args:
        a (int): The first number
        b (int): The second number

    Returns:
        int: The sum of the two numbers
    """
    return a + b

messages = [{'role': 'user', 'content': 'what is three plus one?'}]

response: ChatResponse = chat(
    model='qwen3',
    messages=messages,
    tools=[add_two_numbers],  # The Python SDK supports passing tools as plain functions
    stream=True,
)

for chunk in response:
    # Print the model's text output as it streams in
    print(chunk.message.content, end='', flush=True)
    # Print any tool call the model makes
    if chunk.message.tool_calls:
        print(chunk.message.tool_calls)
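The official example only prints the tool call; the model never runs any code itself. To actually execute the tool, you look up the requested function by name and call it with the arguments the model supplies. Below is a minimal sketch of that step, using the non-streaming form of chat for brevity; tool.function.name and tool.function.arguments are the fields the Python SDK exposes on each tool call, while the available_functions registry is just a convenience for this sketch.

from ollama import chat

# Map tool names to the actual Python callables
available_functions = {'add_two_numbers': add_two_numbers}

response = chat(
    model='qwen3',
    messages=messages,
    tools=[add_two_numbers],
)

for tool in response.message.tool_calls or []:
    fn = available_functions.get(tool.function.name)
    if fn is None:
        print('Model requested an unknown tool:', tool.function.name)
        continue
    # arguments is a dict parsed from the model's JSON output
    result = fn(**tool.function.arguments)
    print('Tool result:', result)

The result can then be appended to messages as a tool-role message and sent back to the model in a second chat() call so it can phrase the final answer.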
Let's break down the steps in the official example:
Defining a tool
In Ollama, a tool refers to an external function or capability that a language model can call dynamically during a conversation or task execution.
Defining a tool is really just defining a function. Right after the function's def line comes a docstring, whose job is to describe the function. Docstrings are typically enclosed in triple quotes (""" or ''') and can span multiple lines, and they are accessible via the function's __doc__ attribute or tools like help(). Consider the following function:
def add_numbers(a, b):
    """Adds two numbers and returns their sum.

    Args:
        a (int or float): The first number.
        b (int or float): The second number.

    Returns:
        int or float: The sum of a and b.

    Example:
        >>> add_numbers(2, 3)
        5
    """
    return a + b
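Because the docstring is stored on the function itself as its __doc__ attribute, you can inspect it directly:

print(add_numbers.__doc__)  # the raw docstring text
help(add_numbers)           # a formatted help page built from the same text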
The docstring serves the following purposes:
- Purpose: What the function does (e.g., "Adds two numbers and returns their sum").
- Parameters: The inputs, their types, and purpose (e.g., a and b as numbers).
- Return value: What the function returns and its type (e.g., sum as int or float).
- Examples (optional): Usage examples to clarify behavior.
- Additional info (optional): Exceptions, side effects, or notes.
In the context of Ollama's tool calling (also known as function calling), a Python function's docstring plays a critical role in defining how the language model (e.g., Llama 3.1) understands and interacts with the function as a tool. When a function is passed as a tool to Ollama’s chat API, the library parses the docstring to generate the tool’s JSON representation, which the model uses to decide when and how to call the function.
For the function above, when the Ollama API passes the tool to the model, it uses the docstring to generate a JSON description of the function along these lines (simplified here for illustration):
{
    "function_name": "add_numbers",
    "description": "Adds two numbers and returns their sum.",
    "parameters": [
        {
            "name": "a",
            "type": "int or float",
            "description": "The first number."
        },
        {
            "name": "b",
            "type": "int or float",
            "description": "The second number."
        }
    ],
    "returns": {
        "type": "int or float",
        "description": "The sum of a and b."
    },
    "example": {
        "input": "add_numbers(2, 3)",
        "output": 5
    }
}
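If you prefer not to rely on docstring parsing, the chat API also accepts tools declared explicitly as OpenAI-style function schemas. The sketch below shows add_numbers defined that way; the model name and the choice of 'number' as the JSON type are assumptions made for illustration.

from ollama import chat

add_numbers_tool = {
    'type': 'function',
    'function': {
        'name': 'add_numbers',
        'description': 'Adds two numbers and returns their sum.',
        'parameters': {
            'type': 'object',
            'properties': {
                'a': {'type': 'number', 'description': 'The first number.'},
                'b': {'type': 'number', 'description': 'The second number.'},
            },
            'required': ['a', 'b'],
        },
    },
}

response = chat(
    model='qwen3',
    messages=[{'role': 'user', 'content': 'what is 2 plus 3?'}],
    tools=[add_numbers_tool],  # explicit schema instead of a function reference
)
print(response.message.tool_calls)

Passing a function reference and passing an explicit schema achieve the same thing; the former is simply more convenient when a well-documented Python function already exists.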