Supported tool list & tool calls returned by the LLM
Tool calling
# OpenAI chat API
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
completion = client.chat.completions.create(
    model=model_name,
    messages=messages,   # conversation history (see example below)
    tools=tools,         # supported tool list (see json_schema below)
    tool_choice="auto",
)
# The newer OpenAI Responses API also supports tool calling (not shown here).
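If the model decides to call a tool, the OpenAI SDK exposes the call on the returned message; a minimal sketch of reading it back from the completion above:

# reading the tool call back from the completion above
import json

msg = completion.choices[0].message
if msg.tool_calls:                                   # None when the model answered in plain text
    call = msg.tool_calls[0]
    tool_name = call.function.name                   # e.g. "get_current_temperature"
    tool_args = json.loads(call.function.arguments)  # arguments arrive as a JSON string
    print(tool_name, tool_args)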
LLM input = messages + tools
tools is the list of supported tools. It is usually given in JSON format, though some models use a non-JSON format:
- JSON format: a json-schema description of each tool (used by most models, see below)
- non-JSON format: a model-specific custom format (e.g. the Hermes template below)
LLM output = tool name + arguments
The format the LLM returns is usually one of two kinds: pythonic or JSON.
JSON response example (llama3.2_json):
please respond with JSON for a function call.
Respond in the format {"name": function name, "parameters": dictionary of argument name and its value}.
Pythonic response example (llama3.2_pythonic, unofficial):
Please respond with a python list of the calls.
Respond in the format [func_name1(params_name1=params_value1, params_name2=params_value2...), func_name2(params)]
Most models use the JSON return format.
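For illustration, a hedged sketch of turning either response style into (tool name, arguments) pairs; both helper names are made up for this example:

# illustrative parsers for the two response styles (helper names are hypothetical)
import ast
import json

def parse_json_style(text):
    # e.g. '{"name": "get_current_temperature", "parameters": {"location": "Paris, France"}}'
    obj = json.loads(text)
    return obj["name"], obj.get("parameters", obj.get("arguments", {}))

def parse_pythonic_style(text):
    # e.g. '[get_current_temperature(location="Paris, France")]'
    # keyword arguments only, for brevity
    calls = []
    for node in ast.parse(text, mode="eval").body.elts:
        name = node.func.id
        args = {kw.arg: ast.literal_eval(kw.value) for kw in node.keywords}
        calls.append((name, args))
    return calls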
LLM input example: messages and tools
messages = [
    {
        "role": "system",
        "content": "You are a bot that responds to weather queries."
    },
    {
        "role": "user",
        "content": "Hey, what's the temperature in Paris right now?"
    }
]
# chat_completion = client.chat.completions.create(messages=messages, model=model, tools=tools)  # the LLM can be called like this
json_schema of tools:
{
    "type": "function",
    "function": {
        "name": "get_current_temperature",
        "description": "Get the current temperature at a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The location to get the temperature for, in the format \"City, Country\""
                }
            },
            "required": [
                "location"
            ]
        },
        "return": {
            "type": "number",
            "description": "The current temperature at the specified location in the specified units, as a float."
        }
    }
}
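For reference, a schema like this (including the return field) can be generated from a plain Python function; a minimal sketch, assuming a recent Hugging Face transformers release where the get_json_schema utility is available (the exact output may differ slightly between versions):

# deriving the tool schema from a typed, documented Python function (recent transformers assumed)
from transformers.utils import get_json_schema

def get_current_temperature(location: str) -> float:
    """
    Get the current temperature at a location.

    Args:
        location: The location to get the temperature for, in the format "City, Country"
    Returns:
        The current temperature at the specified location in the specified units, as a float.
    """
    return 22.0  # dummy implementation

tools = [get_json_schema(get_current_temperature)]  # yields the json_schema shown above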
LLM input example: rendered as a prompt string
llama3.1-405b ⭐️⭐️
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
Environment: ipython
Cutting Knowledge Date: December 2023
Today Date: 26 Jul 2024
You are a bot that responds to weather queries.<|eot_id|><|start_header_id|>user<|end_header_id|>
Given the following functions, please respond with a JSON for a function call with its proper arguments that best answers the given prompt.
Respond in the format {"name": function name, "parameters": dictionary of argument name and its value}.Do not use variables.
{
    "type": "function",
    "function": {
        "name": "get_current_temperature",
        "description": "Get the current temperature at a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The location to get the temperature for, in the format \"City, Country\""
                }
            },
            "required": [
                "location"
            ]
        },
        "return": {
            "type": "number",
            "description": "The current temperature at the specified location in the specified units, as a float."
        }
    }
}
Hey, what's the temperature in Paris right now?<|eot_id|><|start_header_id|>assistant<|end_header_id|>
- Input:
  - tools format: the supported tool list (tools) is based on json-schema; the chat_template dumps it with tojson(indent=4).
  - Position of tools in the prompt: prepended to the content of the first user turn (the system message and the other turns are unchanged).
- Output:
  - Output format: the returned response must be JSON: "respond with a JSON ... Respond in the format {"name": function name, "parameters": dictionary of argument name and its value}".
  - How the tool call is parsed from the output: ?
- Evaluation:
  - Drawbacks:
    - Input: the template forcibly injects "Environment: ipython" into the system prompt.
    - Input: the template forcibly injects the "Today Date" line.
    - Output format: the returned content is not wrapped in <tool> tags, which makes parsing inconvenient (see the parsing sketch after this list).
  - Rating: ⭐️⭐️
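Because the model answers with bare JSON and no wrapper tag, a parser can only treat the whole assistant message as a candidate call and fall back when decoding fails; a minimal sketch (parse_llama31_tool_call is a hypothetical helper name):

# parsing a llama3.1-style tool call: the assistant message is expected to be bare JSON
import json

def parse_llama31_tool_call(text):
    # e.g. '{"name": "get_current_temperature", "parameters": {"location": "Paris, France"}}'
    try:
        obj = json.loads(text.strip())
        return obj["name"], obj["parameters"]
    except (json.JSONDecodeError, KeyError):
        # no wrapper tag, so a plain-text answer can only be told apart by this failure
        return None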
Hermes-3-Llama-3.1-405B ⭐️⭐️⭐️
<|begin_of_text|><|im_start|>system
You are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags. You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions. Here are the available tools: <tools> {"type": "function", "function": {"name": "get_current_temperature", "description": "get_current_temperature(location: str) -> float - Get the current temperature at a location.
Args:
location(str): The location to get the temperature for, in the format "City, Country"
Returns:
The current temperature at the specified location in the specified units, as a float.", "parameters": {"type": "object", "properties": {"location": {"type": "string", "description": "The location to get the temperature for, in the format \"City, Country\""}}, "required": ["location"]}} </tools>Use the following pydantic model json schema for each tool call you will make: {"properties": {"name": {"title": "Name", "type": "string"}, "arguments": {"title": "Arguments", "type": "object"}}, "required": ["name", "arguments"], "title": "FunctionCall", "type": "object"}}
For each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows:
<tool_call>
{"name": <function-name>, "arguments": <args-dict>}
</tool_call><|im_end|>
<|im_start|>system
You are a bot that responds to weather queries.<|im_end|>
<|im_start|>user
Hey, what's the temperature in Paris right now?<|im_end|>
<|im_start|>assistant
For readability, the indentation of the prompt above has been adjusted manually.
- Input:
  - tools format: the supported tool list (tools) uses a custom (json-schema-like) format; see the chat_template for details.
  - Position of tools in the prompt: an extra system turn is added at the very beginning (the user-supplied system message is unchanged). This extra system turn tells the model it may use the tools and specifies the return format.
- Output: the returned response must be JSON wrapped in <tool_call> tags: "return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows: <tool_call>{"name": <function-name>, "arguments": <args-dict>}</tool_call>" (a parsing sketch follows this list).
- Evaluation:
  - Drawback: although the input tools format has the same fields as json-schema and looks like JSON, it is not strict JSON, which hurts interoperability.
  - Rating: ⭐️⭐️⭐️
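Since each call is wrapped in <tool_call> tags, extraction is a simple regex plus json.loads; a minimal sketch (parse_wrapped_tool_calls is a hypothetical helper name):

# extracting <tool_call>-wrapped calls (Hermes / qwen3 style)
import json
import re

TOOL_CALL_RE = re.compile(r"<tool_call>\s*(.*?)\s*</tool_call>", re.DOTALL)

def parse_wrapped_tool_calls(text):
    calls = []
    for block in TOOL_CALL_RE.findall(text):
        obj = json.loads(block)
        calls.append((obj["name"], obj["arguments"]))
    return calls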
llama4 ⭐️⭐️⭐️
<|begin_of_text|><|header_start|>system<|header_end|>
Environment: ipython
You are a bot that responds to weather queries.<|eot|><|header_start|>user<|header_end|>
Given the following functions, please respond with a JSON for a function call with its proper arguments that best answers the given prompt.
Respond in the format {"name": function name, "parameters": dictionary of argument name and its value}.Do not use variables.
{
    "type": "function",
    "function": {
        "name": "get_current_temperature",
        "description": "Get the current temperature at a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The location to get the temperature for, in the format \"City, Country\""
                }
            },
            "required": [
                "location"
            ]
        },
        "return": {
            "type": "number",
            "description": "The current temperature at the specified location in the specified units, as a float."
        }
    }
}
Hey, what's the temperature in Paris right now?<|eot|><|header_start|>assistant<|header_end|>
Almost the same as llama3.1, only the date lines are gone and the special tokens are different. (The tools are likewise prepended to the first user turn.)
- Evaluation: same as llama3.1
- Rating: ⭐️⭐️⭐️
mistralai/Ministral-8B-Instruct-2410
<s>[AVAILABLE_TOOLS][{"type": "function", "function": {"name": "get_current_temperature", "description": "Get the current temperature at a location.", "parameters": {"type": "object", "properties": {"location": {"type": "string", "description": "The location to get the temperature for, in the format \"City, Country\""}}, "required": ["location"]}}}][/AVAILABLE_TOOLS][INST]You are a bot that responds to weather queries.
Hey, what's the temperature in Paris right now?[/INST]
- Input: the supported tool list (tools) is passed as a JSON array of json-schema-style objects, wrapped in [AVAILABLE_TOOLS] tags.
- Output: the required response format is ?
qwen3 ⭐️⭐️⭐️⭐️⭐️
<|im_start|>system
You are a bot that responds to weather queries.
# Tools
You may call one or more functions to assist with the user query.
You are provided with function signatures within <tools></tools> XML tags:
<tools>
{"type": "function", "function": {"name": "get_current_temperature", "description": "Get the current temperature at a location.", "parameters": {"type": "object", "properties": {"location": {"type": "string", "description": "The location to get the temperature for, in the format \"City, Country\""}}, "required": ["location"]}, "return": {"type": "number", "description": "The current temperature at the specified location in the specified units, as a float."}}}
</tools>
For each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:
<tool_call>
{"name": <function-name>, "arguments": <args-json-object>}
</tool_call><|im_end|>
<|im_start|>user
Hey, what's the temperature in Paris right now?<|im_end|>
<|im_start|>assistant
- Input:
  - tools format: the supported tool list (tools) is based on json-schema.
  - Position of tools in the prompt: appended to the end of the original system message, together with extra prompt text (telling the model it may use the tools and specifying the return format).
- Output: the returned response must be JSON wrapped in <tool_call> tags: "return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{"name": <function-name>, "arguments": <args-json-object>}\n</tool_call>" (see the end-to-end sketch after this list).
- Rating: ⭐️⭐️⭐️⭐️⭐️
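Putting the recommended pieces together, a minimal end-to-end sketch for qwen3, assuming the Qwen/Qwen3-8B checkpoint and a transformers version whose apply_chat_template accepts a tools argument; parse_wrapped_tool_calls is the hypothetical helper from the Hermes section above:

# end-to-end prompt construction for qwen3 (model name and versions are assumptions)
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen3-8B")
prompt = tokenizer.apply_chat_template(
    messages,                      # the messages example above
    tools=tools,                   # the json_schema tools above (or the Python function itself)
    add_generation_prompt=True,
    tokenize=False,
)
# prompt now matches the qwen3 layout shown above; after generation,
# the assistant text can be parsed with parse_wrapped_tool_calls() from the Hermes section.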
deepseek-r1 ⭐️⭐️
<|begin▁of▁sentence|>You are a bot that responds to weather queries.<|User|>Hey, what's the temperature in Paris right now?<|Assistant|><think>
- Evaluation:
  - Drawback: tools are not supported (they are dropped from the rendered prompt, as shown above), and handling of tool_calls messages is also problematic.
  - Rating: ⭐️⭐️
deepseek-v3.1 ⭐️⭐️
Same as deepseek-r1.
gemma-3-27b-it ⭐️
Does not support tools at all.
grok-2 ⭐️
Does not support tools at all.