---
title: Supported tool list & tool calls returned by the LLM
---
- [Tool calling](#tool-calling)
- [LLM input = messages + tools](#llm-input--messages--tools)
- [LLM output = tool name + arguments](#llm-output--tool-name--arguments)
- [LLM input example: messages and tools](#llm-input-example-messages-and-tools)
- [LLM input example: rendered as a prompt string](#llm-input-example-rendered-as-a-prompt-string)
- [llama3.1-405b ⭐️⭐️](#llama31-405b-️️)
- [Hermes-3-Llama-3.1-405B ⭐️⭐️⭐️](#hermes-3-llama-31-405b-️️️)
- [llama4 ⭐️⭐️⭐️](#llama4-️️️)
- [mistralai/Ministral-8B-Instruct-2410](#mistralaiministral-8b-instruct-2410)
- [qwen3 ⭐️⭐️⭐️⭐️⭐️](#qwen3-️️️️️)
- [deepseek-r1 ⭐️⭐️](#deepseek-r1-️️)
- [deepseek-v3.1 ⭐️⭐️](#deepseek-v31-️️)
- [gemma-3-27b-it ⭐️](#gemma-3-27b-it-️)
- [grok-2 ⭐️](#grok-2-️)
## Tool calling
```py
# OpenAI Chat Completions API
completion = client.chat.completions.create(
    model=model_name,
    messages=messages,
    tools=tools,  # tool list defined above
    tool_choice="auto"
)
# The newer OpenAI Responses API offers an equivalent interface.
```
## LLM input = messages + tools
`tools` is the list of tools available to the model.
It is usually serialized as JSON, though some templates use a non-JSON format:
- JSON format: e.g. llama3.1, qwen3 (see below)
- non-JSON format: e.g. Hermes-3 (a custom, JSON-Schema-like layout; see below)
## LLM output = tool name + arguments
The LLM's response usually comes in one of two formats: `pythonic` or `json`.
JSON response example: [llama3.2_json](https://github.com/vllm-project/vllm/blob/v0.10.1/examples/tool_chat_template_llama3.2_json.jinja)
```sh
please respond with JSON for a function call.
Respond in the format {"name": function name, "parameters": dictionary of argument name and its value}.
```
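A completion in this format can be parsed directly with the standard library. A minimal sketch (the raw output string below is a hypothetical example, not captured model output):

```py
import json

# Hypothetical raw completion following the JSON tool-call format above
raw = '{"name": "get_current_temperature", "parameters": {"location": "Paris, France"}}'

call = json.loads(raw)
print(call["name"])        # get_current_temperature
print(call["parameters"])  # {'location': 'Paris, France'}
```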
Pythonic response example: [llama3.2_pythonic (unofficial)](https://github.com/vllm-project/vllm/blob/v0.10.1/examples/tool_chat_template_llama3.2_pythonic.jinja)
```sh
Please respond with a python list of the calls.
Respond in the format [func_name1(params_name1=params_value1, params_name2=params_value2...), func_name2(params)]
```
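The pythonic format can be parsed without ever executing the model output by walking the AST. A sketch, under the assumption that the calls use only keyword arguments with literal values:

```py
import ast

# Hypothetical raw completion in the pythonic tool-call format
raw = '[get_current_temperature(location="Paris, France")]'

calls = []
for node in ast.parse(raw, mode="eval").body.elts:  # each element is a Call node
    calls.append({
        "name": node.func.id,
        "arguments": {kw.arg: ast.literal_eval(kw.value) for kw in node.keywords},
    })
print(calls)
```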
Most models adopt the JSON response format.
## LLM input example: messages and tools
```py
messages = [
    {
        "role": "system",
        "content": "You are a bot that responds to weather queries."
    },
    {
        "role": "user",
        "content": "Hey, what's the temperature in Paris right now?"
    }
]
# chat_completion = client.chat.completions.create(messages=messages, model=model, tools=tools)  # the LLM can be called like this
```
The `json_schema` of the tools:
```json
{
    "type": "function",
    "function": {
        "name": "get_current_temperature",
        "description": "Get the current temperature at a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The location to get the temperature for, in the format \"City, Country\""
                }
            },
            "required": [
                "location"
            ]
        },
        "return": {
            "type": "number",
            "description": "The current temperature at the specified location in the specified units, as a float."
        }
    }
}
```
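Before a returned call is actually executed, its arguments can be checked against the `parameters` part of this schema. A minimal hand-rolled sketch (a real implementation would typically use a JSON Schema validator such as the `jsonschema` package):

```py
params_schema = {
    "type": "object",
    "properties": {
        "location": {"type": "string"}
    },
    "required": ["location"],
}

def validate_args(args: dict, schema: dict) -> list:
    """Return a list of validation errors; an empty list means the args are valid."""
    errors = [f"missing required argument: {key}"
              for key in schema.get("required", []) if key not in args]
    for key, value in args.items():
        prop = schema.get("properties", {}).get(key)
        if prop is None:
            errors.append(f"unexpected argument: {key}")
        elif prop.get("type") == "string" and not isinstance(value, str):
            errors.append(f"argument {key} should be a string")
    return errors

print(validate_args({"location": "Paris, France"}, params_schema))  # []
print(validate_args({}, params_schema))  # ['missing required argument: location']
```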
## LLM input example: rendered as a prompt string
### llama3.1-405b ⭐️⭐️
```sh
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
Environment: ipython
Cutting Knowledge Date: December 2023
Today Date: 26 Jul 2024
You are a bot that responds to weather queries.<|eot_id|><|start_header_id|>user<|end_header_id|>
Given the following functions, please respond with a JSON for a function call with its proper arguments that best answers the given prompt.
Respond in the format {"name": function name, "parameters": dictionary of argument name and its value}.Do not use variables.
{
    "type": "function",
    "function": {
        "name": "get_current_temperature",
        "description": "Get the current temperature at a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The location to get the temperature for, in the format \"City, Country\""
                }
            },
            "required": [
                "location"
            ]
        },
        "return": {
            "type": "number",
            "description": "The current temperature at the specified location in the specified units, as a float."
        }
    }
}
Hey, what's the temperature in Paris right now?<|eot_id|><|start_header_id|>assistant<|end_header_id|>
```
- **Input**:
  - **tools format**: the tool list (`tools`) is JSON-Schema-based, because the chat template serializes it with [`tojson(indent=4)`](https://github.com/vllm-project/vllm/blob/v0.10.1/examples/tool_chat_template_llama3.1_json.jinja#L48)
  - **position of tools in the prompt**: prepended to the content of the first user turn (the system turn and all other message turns are unchanged).
- **Output**:
  - **output format**: the `response` is required to be JSON: `respond with a JSON ... Respond in the format {"name": function name, "parameters": dictionary of argument name and its value}`
  - **parsing of the returned tool call**: ?
- **Evaluation**
  - Downsides:
    - input: `Environment: ipython`
    - input: `Today Date`
    - output format: the response is not wrapped in a `<tool>` tag, which makes parsing inconvenient.
  - **Rating**: ⭐️⭐️
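Since the call is not wrapped in any marker tag, one workable heuristic is to treat the completion as a tool call only when the whole text parses as a JSON object with the expected keys. A sketch (this helper is an illustration assumed here, not part of any official llama tooling):

```py
import json

def parse_unwrapped_json_call(text: str):
    """Return {"name": ..., "parameters": ...} if the whole completion is such a JSON object, else None."""
    try:
        obj = json.loads(text.strip())
    except json.JSONDecodeError:
        return None
    if isinstance(obj, dict) and "name" in obj and "parameters" in obj:
        return obj
    return None

print(parse_unwrapped_json_call('{"name": "get_current_temperature", "parameters": {"location": "Paris, France"}}'))
print(parse_unwrapped_json_call("It is sunny in Paris."))  # None
```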
### Hermes-3-Llama-3.1-405B ⭐️⭐️⭐️
```sh
<|begin_of_text|><|im_start|>system
You are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags. You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions. Here are the available tools: <tools> {"type": "function", "function": {"name": "get_current_temperature", "description": "get_current_temperature(location: str) -> float - Get the current temperature at a location.
Args:
location(str): The location to get the temperature for, in the format "City, Country"
Returns:
The current temperature at the specified location in the specified units, as a float.", "parameters": {"type": "object", "properties": {"location": {"type": "string", "description": "The location to get the temperature for, in the format \"City, Country\""}}, "required": ["location"]}} </tools>Use the following pydantic model json schema for each tool call you will make: {"properties": {"name": {"title": "Name", "type": "string"}, "arguments": {"title": "Arguments", "type": "object"}}, "required": ["name", "arguments"], "title": "FunctionCall", "type": "object"}}
For each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows:
<tool_call>
{"name": <function-name>, "arguments": <args-dict>}
</tool_call><|im_end|>
<|im_start|>system
You are a bot that responds to weather queries.<|im_end|>
<|im_start|>user
Hey, what's the temperature in Paris right now?<|im_end|>
<|im_start|>assistant
```
> For readability, the indentation of the prompt above was adjusted manually.
- **Input**:
  - **tools format**: the tool list (`tools`) uses a custom, JSON-Schema-like format; see the [chat_template](https://huggingface.co/spaces/xu-song/tokenizer-arena/blob/main/docs/chat-template/Hermes-3-Llama-3.1-405B/chat_template.tool_use.jinja#L42)
  - **position of tools in the prompt**: an extra `system` turn is added at the very beginning (the user-supplied `system` turn is kept unchanged). This extra system turn tells the model which tools it may use and specifies the response format.
- **Output**: the `response` must be JSON wrapped in `<tool_call>` tags:
  `return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows: <tool_call>{"name": <function-name>, "arguments": <args-dict>}</tool_call>`
- **Evaluation**:
  - Downside: the `tools` format in the input shares its fields with JSON Schema and looks like JSON, but it is not strict JSON, which hurts interoperability.
  - **Rating**: ⭐️⭐️⭐️
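The `<tool_call>` wrapper makes extraction straightforward with a regular expression. A sketch, assuming each wrapped payload is a single JSON object:

```py
import json
import re

TOOL_CALL_RE = re.compile(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", re.DOTALL)

def extract_tool_calls(text: str) -> list:
    """Extract every JSON payload wrapped in <tool_call></tool_call> tags."""
    return [json.loads(payload) for payload in TOOL_CALL_RE.findall(text)]

completion = ('Sure.\n<tool_call>\n'
              '{"name": "get_current_temperature", "arguments": {"location": "Paris, France"}}\n'
              '</tool_call>')
print(extract_tool_calls(completion))
```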
### llama4 ⭐️⭐️⭐️
```sh
<|begin_of_text|><|header_start|>system<|header_end|>
Environment: ipython
You are a bot that responds to weather queries.<|eot|><|header_start|>user<|header_end|>
Given the following functions, please respond with a JSON for a function call with its proper arguments that best answers the given prompt.
Respond in the format {"name": function name, "parameters": dictionary of argument name and its value}.Do not use variables.
{
    "type": "function",
    "function": {
        "name": "get_current_temperature",
        "description": "Get the current temperature at a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The location to get the temperature for, in the format \"City, Country\""
                }
            },
            "required": [
                "location"
            ]
        },
        "return": {
            "type": "number",
            "description": "The current temperature at the specified location in the specified units, as a float."
        }
    }
}
Hey, what's the temperature in Paris right now?<|eot|><|header_start|>assistant<|header_end|>
```
Almost identical to llama3.1: the `date` line is dropped and the `special_token`s are different. (Tools are likewise prepended to the first user turn.)
- **Evaluation**: same as llama3.1
- **Rating**: ⭐️⭐️⭐️
### mistralai/Ministral-8B-Instruct-2410
```sh
<s>[AVAILABLE_TOOLS][{"type": "function", "function": {"name": "get_current_temperature", "description": "Get the current temperature at a location.", "parameters": {"type": "object", "properties": {"location": {"type": "string", "description": "The location to get the temperature for, in the format \"City, Country\""}}, "required": ["location"]}}}][/AVAILABLE_TOOLS][INST]You are a bot that responds to weather queries.
Hey, what's the temperature in Paris right now?[/INST]
```
- **Input**: the tool list (`tools`) is JSON-based, wrapped in `[AVAILABLE_TOOLS]...[/AVAILABLE_TOOLS]` ahead of the first `[INST]` turn (the system content is merged into that user turn).
- **Output**: the required `response` format is ?
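A simplified rendering of this input layout can be sketched as plain string assembly. This mirrors the prompt shown above, not the actual Jinja template:

```py
import json

tools = [{"type": "function", "function": {
    "name": "get_current_temperature",
    "description": "Get the current temperature at a location.",
    "parameters": {"type": "object",
                   "properties": {"location": {"type": "string"}},
                   "required": ["location"]}}}]

# Tool list serialized as JSON inside [AVAILABLE_TOOLS], then system + user merged into one [INST] turn
prompt = ("<s>[AVAILABLE_TOOLS]" + json.dumps(tools) + "[/AVAILABLE_TOOLS]"
          + "[INST]You are a bot that responds to weather queries.\n\n"
          + "Hey, what's the temperature in Paris right now?[/INST]")
print(prompt)
```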
### qwen3 ⭐️⭐️⭐️⭐️⭐️
```sh
<|im_start|>system
You are a bot that responds to weather queries.
# Tools
You may call one or more functions to assist with the user query.
You are provided with function signatures within <tools></tools> XML tags:
<tools>
{"type": "function", "function": {"name": "get_current_temperature", "description": "Get the current temperature at a location.", "parameters": {"type": "object", "properties": {"location": {"type": "string", "description": "The location to get the temperature for, in the format \"City, Country\""}}, "required": ["location"]}, "return": {"type": "number", "description": "The current temperature at the specified location in the specified units, as a float."}}}
</tools>
For each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:
<tool_call>
{"name": <function-name>, "arguments": <args-json-object>}
</tool_call><|im_end|>
<|im_start|>user
Hey, what's the temperature in Paris right now?<|im_end|>
<|im_start|>assistant
```
- **Input**:
  - **tools format**: the tool list (`tools`) is JSON-Schema-based.
  - **position of tools in the prompt**: appended to the end of the original `system` content, plus extra prompt text (telling the model which tools it may use and specifying the response format).
- **Output**: the `response` must be JSON wrapped in `<tool_call>` tags:
  `return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{"name": <function-name>, "arguments": <args-json-object>}\n</tool_call>`
- **Rating**: ⭐️⭐️⭐️⭐️⭐️
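The way tools are spliced into the system turn can be sketched as follows. This is a simplified approximation of the rendered prompt above, not the actual qwen3 Jinja template:

```py
import json

system = "You are a bot that responds to weather queries."
tools = [{"type": "function", "function": {"name": "get_current_temperature"}}]

# One JSON line per tool inside <tools></tools>, appended after the original system content
tool_lines = "\n".join(json.dumps(t) for t in tools)
system_turn = (
    f"<|im_start|>system\n{system}\n\n# Tools\n\n"
    "You may call one or more functions to assist with the user query.\n\n"
    "You are provided with function signatures within <tools></tools> XML tags:\n"
    f"<tools>\n{tool_lines}\n</tools>\n\n"
    "For each function call, return a json object with function name and arguments "
    "within <tool_call></tool_call> XML tags:\n"
    '<tool_call>\n{"name": <function-name>, "arguments": <args-json-object>}\n'
    "</tool_call><|im_end|>\n"
)
print(system_turn)
```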
### deepseek-r1 ⭐️⭐️
```sh
<|begin▁of▁sentence|>You are a bot that responds to weather queries.<|User|>Hey, what's the temperature in Paris right now?<|Assistant|><think>
```
- **Evaluation**:
  - Downside: `tools` are not supported, and `tool_calls` handling is broken as well.
- **Rating**: ⭐️⭐️
### deepseek-v3.1 ⭐️⭐️
Same as deepseek-r1.
### gemma-3-27b-it ⭐️
Does not support tools at all.
### grok-2 ⭐️
Does not support tools at all.