---
license: cc-by-4.0
datasets:
- Salesforce/xlam-function-calling-60k
base_model: Qwen/Qwen2-7B-Instruct
---
# Hammer-7b Function Calling Model
## Introduction
Function calling enables LLMs to invoke specific functions, integrating external capabilities, accessing real-world data, and extending their abilities beyond text generation. We present Hammer, a fine-tuned model based on Qwen2-7B-Instruct. Unlike previous works that emphasize data refinement (cite xlam, IBM…), our focus is on applying novel training techniques to address recognized issues in existing function-calling models. These issues are listed below:
1. Hallucination
- a) Function name hallucination: Rather than selecting from the provided function pool, the model tends to generate a new function name based on its own world knowledge.
- b) Parameter name hallucination: When the user does not provide enough information to fulfill the request (necessary parameters are missing), the model is inclined to fill in those parameters from its own knowledge.
2. Overfitting
- a) Function name and parameter name: The model pays excessive attention to the function name and parameter name while neglecting other information such as description, input, and output. This leads to a lack of generalization and reduces the model's ability to handle diverse scenarios.
- b) Parameter filling: The model does not extract parameters based on the provided function definition; instead, it fills them in based on knowledge learned during training. For instance, when "San Francisco" is expected, the model might fill in "San Francisco, CA" because every "San Francisco" in the training data is followed by "CA".
- c) Default value filling: The model fills in parameter default values according to patterns in the training data rather than the provided function definition. For example, when "default = inch" is most common in the training data, the model is likely to fill in "inch" instead of "cm", even though the latter is the provided default value in the function definition.
- d) Ordering of the provided function list and parameter list: When the provided function list or parameter list has a consistent ordering during training, the model may learn unintended patterns, such as memorizing that ordering.
3. Instructions missing key information
Occasionally, user instructions lack details essential for effective function execution. For instance, the command "Set an alarm to wake me up" lacks a time specification. Ideally, in such cases the model should either request additional information or output only the function name, excluding the unspecified parameter. Existing methods either disregard such situations or output an "irrelevant" signal, indicating the query is unfulfillable with the given tools.
4. Prompt design
Inconsistency in instruction formatting between training and testing can result in a significant performance gap. For example, during training the default value may be provided in the parameter description, while during testing it is provided as a separate field in the JSON tool definition (see the illustrative snippet below).
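To make this mismatch concrete, the snippet below shows the same parameter written both ways; the parameter name `unit` and its default are hypothetical, not taken from our training data. A model trained exclusively on one variant may miss the default entirely when served the other.
~~~python
# Variant A: default embedded in the parameter description as free text
param_in_description = {
    "unit": {
        "type": "string",
        "description": "The unit of length to use, default is 'cm'",
    }
}

# Variant B: default given as a separate, structured field
param_as_field = {
    "unit": {
        "type": "string",
        "description": "The unit of length to use",
        "default": "cm",
    }
}
~~~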
In this work, we focus on introducing function calling abilities with an inherent emphasis on addressing the aforementioned limitations. We summarize our techniques as follows:
1. Masking: We propose a function/parameter masking technique, a dynamic data augmentation method that shifts the model's focus from tool names to tool descriptions within the tool definitions (a minimal sketch of these operations follows this list). The masking operations include:
- a) Function Name Masking: Replacing the function name with a randomly generated string to ensure the model pays more attention to the function description rather than function names.
- b) Parameter Name Masking: Replacing the parameter name with a randomly generated string to ensure the model pays more attention to the parameter description rather than parameter names.
- c) Default Value Masking: Default values are replaced with random strings to prevent overfitting to specific values.
2. Function Shuffling: Randomly reordering functions and parameters during training deters the model from memorizing their sequence.
3. Prompt Optimization: Since our model concentrates on function/parameter descriptions, we incorporate default-value information into those descriptions to boost performance at inference.
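The sketch below illustrates these augmentations on a JSON-schema-style tool definition. The helper names and random-string scheme are illustrative only; in the real pipeline the ground-truth call and the `required` list must be renamed consistently with the masked parameters, which this sketch omits.
~~~python
import copy
import random
import string

def _rand_name(length: int = 8) -> str:
    # Random identifier used to overwrite names/defaults during augmentation
    return "".join(random.choices(string.ascii_lowercase, k=length))

def mask_tool(tool: dict) -> dict:
    """Return a copy of a tool definition whose function name, parameter names,
    and default values are replaced by random strings, forcing the model to
    rely on the descriptions instead."""
    masked = copy.deepcopy(tool)
    masked["name"] = _rand_name()                      # a) function name masking
    props = masked.get("parameters", {}).get("properties", {})
    renamed = {}
    for spec in props.values():
        if "default" in spec:
            spec["default"] = _rand_name()             # c) default value masking
        renamed[_rand_name()] = spec                   # b) parameter name masking
    if "parameters" in masked:
        masked["parameters"]["properties"] = renamed
    return masked

def shuffle_tools(tools: list) -> list:
    # Function shuffling: randomize the order of candidate tools per training sample
    shuffled = list(tools)
    random.shuffle(shuffled)
    return shuffled
~~~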
Addressing these multifaceted issues necessitates a refined approach to model training and optimization. To this end, we developed an advanced function calling model by fine-tuning *Qwen2-7B-Instruct*. The following steps give an overview of the methods and processes implemented during training to mitigate these issues:
1. **Data Extraction and Preparation**:
We extracted 7.5k samples from *Salesforce/xlam-function-calling-60k* and removed the target tools from each sample's candidate toolset to generate irrelevance data. These samples were then mixed with the 60k xLAM samples for training.
2. **Fine Tuning**:
Our fine-tuning process primarily leveraged the Low-Rank Adaptation (LoRA) technique, incorporating specific hyperparameters and strategies to ensure optimal model performance.
The masking and function shuffling techniques were applied during training (a configuration sketch follows this list). The training setup is as follows:
- **LoRA Rank**: 32
- **Learning Rate**: 5e-5
- **Warmup Steps**: 100
- **LR Scheduler Type**: Cosine
- **Batch Size**: 4
- **Gradient Accumulation Steps**: 2
- **Hardware**: 4x A100 (80G) GPUs
3. **Inference**:
During inference, since our model focuses more on function/parameter descriptions, we add default-value information to the parameter descriptions to obtain better performance.
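For reference, below is a hedged sketch of a LoRA fine-tuning configuration that matches the hyperparameters listed above, using the Hugging Face `peft` and `transformers` libraries. Values not reported above (e.g. `lora_alpha`, `target_modules`, number of epochs) are placeholders, not our exact settings.
~~~python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM, TrainingArguments

base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2-7B-Instruct", torch_dtype="auto")

lora_config = LoraConfig(
    r=32,                                                      # LoRA rank (as listed above)
    lora_alpha=64,                                             # assumed scaling factor, not reported above
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],   # assumed target modules
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)

training_args = TrainingArguments(
    output_dir="hammer-7b-lora",
    learning_rate=5e-5,
    warmup_steps=100,
    lr_scheduler_type="cosine",
    per_device_train_batch_size=4,
    gradient_accumulation_steps=2,
    num_train_epochs=1,                                        # assumed, not reported above
    bf16=True,
)
# A Trainer would then be built with the masked/shuffled dataset and these arguments.
~~~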
## Supported Function Calling Types
The model is capable of handling various function calling scenarios. Here, the supported types are classified based on the nature of inputs and outputs:
### Input Types
1. **Single Function Input**
2. **Multiple Functions Input**
### Output Types
1. **Simple Function Calling**
2. **Parallel Function Calling**
3. **Irrelevance**
4. **Relevance**
By categorizing function calling types based on inputs and outputs, our model provides robust support for a wide range of function calling scenarios, ensuring both flexibility and precision in handling diverse tasks.
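For illustration, the hypothetical outputs below show what each output type looks like under the JSON output format used in the usage example later in this README (the tool names are taken from that example; the exact strings are illustrative, not guaranteed model outputs). A Relevance case is simply one in which a non-empty call list is expected.
~~~python
import json

# Simple function calling: a single tool call
simple_output = '[{"name": "get_current_weather", "arguments": {"location": "New York, US"}}]'

# Parallel function calling: several tool calls returned in one response
parallel_output = (
    '[{"name": "live_giveaways_by_type", "arguments": {"type": "beta"}}, '
    '{"name": "get_current_weather", "arguments": {"location": "New York, US"}}]'
)

# Irrelevance: none of the provided tools fit the query, so the model outputs an empty list
irrelevance_output = '[]'

# Each output parses into a list of {"name": ..., "arguments": ...} dicts
print(json.loads(parallel_output))
~~~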
## Performance
1. First, we evaluate our model on the Berkeley Function-Calling Leaderboard (BFCL), and the performance is as follows:
<style type="text/css">
.tg {border-collapse:collapse;border-spacing:0;}
.tg td{border-color:black;border-style:solid;border-width:1px;font-family:Arial, sans-serif;font-size:14px;
overflow:hidden;padding:10px 5px;word-break:normal;}
.tg th{border-color:black;border-style:solid;border-width:1px;font-family:Arial, sans-serif;font-size:14px;
font-weight:normal;overflow:hidden;padding:10px 5px;word-break:normal;}
.tg .tg-9id2{color:#007BFF;text-align:center;vertical-align:middle}
.tg .tg-pchv{color:#212529;font-weight:bold;text-align:center;vertical-align:middle}
.tg .tg-qai4{color:#212529;text-align:center;vertical-align:middle}
.tg .tg-p59o{color:#00E;text-align:center;text-decoration:underline;vertical-align:top}
</style>
<table class="tg"><thead>
<tr>
<th class="tg-pchv"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#212529">Rank</span></th>
<th class="tg-pchv"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#212529">Overall</span> <span style="font-weight:700;font-style:normal;text-decoration:none;color:#212529">Acc</span></th>
<th class="tg-pchv"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#212529">Model</span></th>
<th class="tg-pchv"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#212529">AST</span> <span style="font-weight:700;font-style:normal;text-decoration:none;color:#212529">Summary</span></th>
<th class="tg-pchv"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#212529">Exec</span> <span style="font-weight:700;font-style:normal;text-decoration:none;color:#212529">Summary</span></th>
<th class="tg-pchv"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#212529">Irrelevance</span></th>
<th class="tg-pchv"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#212529">Relevance</span></th>
<th class="tg-pchv"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#212529">Organization</span></th>
<th class="tg-pchv"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#212529">License</span></th>
</tr>
</thead>
<tbody>
<tr>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">1</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">85.79</span></td>
<td class="tg-p59o"><a href="https://platform.openai.com/docs/models/gpt-4-and-gpt-4-turbo">GPT-4-0125-Preview (Prompt)</a></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">85.5</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">89.25</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">61.35</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">97.56</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">OpenAI</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Proprietary</span></td>
</tr>
<tr>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">2</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">85</span></td>
<td class="tg-p59o"><a href="https://platform.openai.com/docs/models/gpt-4-and-gpt-4-turbo">GPT-4-1106-Preview (Prompt)</a></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">86.31</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">87.38</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">64.98</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">90.24</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">OpenAI</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Proprietary</span></td>
</tr>
<tr>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">3</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">84.74</span></td>
<td class="tg-p59o"><a href="https://platform.openai.com/docs/models/gpt-4-and-gpt-4-turbo">GPT-4-0613 (Prompt)</a></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">84.66</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">87.57</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">75.57</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">82.93</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">OpenAI</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Proprietary</span></td>
</tr>
<tr>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">4</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">83.92</span></td>
<td class="tg-9id2"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#007BFF">Hammer-7b</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">78.7</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">89.71</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">72.87</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">92.68</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">MadeAgents</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">cc-by-nc-4.0</span></td>
</tr>
<tr>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">5</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">83.89</span></td>
<td class="tg-p59o"><a href="https://platform.openai.com/docs/models/gpt-4-and-gpt-4-turbo">GPT-4-turbo-2024-04-09 (Prompt)</a></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">85.41</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">88.12</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">61.82</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">82.93</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">OpenAI</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Proprietary</span></td>
</tr>
<tr>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">6</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">83.35</span></td>
<td class="tg-p59o"><a href="https://openai.com/index/gpt-4o-mini-advancing-cost-efficient-intelligence/">GPT-4o-mini-2024-07-18 (Prompt)</a></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">80.51</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">87.95</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">79.2</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">80.49</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">OpenAI</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Proprietary</span></td>
</tr>
<tr>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">7</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">83.13</span></td>
<td class="tg-p59o"><a href="https://openai.com/index/hello-gpt-4o/">GPT-4o-2024-05-13 (Prompt)</a></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">83.83</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">85.12</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">77.44</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">78.05</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">OpenAI</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Proprietary</span></td>
</tr>
<tr>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">8</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">82.55</span></td>
<td class="tg-p59o"><a href="https://huggingface.co/meetkai/functionary-medium-v3.1">Functionary-Medium-v3.1 (FC)</a></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">81.06</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">89.32</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">73.23</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">70.73</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">MeetKai</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">MIT</span></td>
</tr>
<tr>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">9</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">81.78</span></td>
<td class="tg-p59o"><a href="https://platform.openai.com/docs/models/gpt-4-and-gpt-4-turbo">GPT-4-1106-Preview (FC)</a></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">77.95</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">87.61</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">72.7</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">82.93</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">OpenAI</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Proprietary</span></td>
</tr>
<tr>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">10</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">81.59</span></td>
<td class="tg-p59o"><a href="https://llama.meta.com/llama3">Meta-Llama-3-70B-Instruct (Prompt)</a></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">80.15</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">88.04</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">50.47</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">92.68</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Meta</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Meta</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Llama</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">3</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Community</span></td>
</tr>
<tr>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">11</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">80.88</span></td>
<td class="tg-p59o"><a href="https://www.anthropic.com/news/claude-3-family">Claude-3-Opus-20240229 (Prompt)</a></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">79.42</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">87.39</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">56.15</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">85.37</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Anthropic</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Proprietary</span></td>
</tr>
<tr>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">12</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">80.87</span></td>
<td class="tg-p59o"><a href="https://platform.openai.com/docs/models/gpt-4-and-gpt-4-turbo">GPT-4-0125-Preview (FC)</a></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">77.02</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">85.3</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">74.03</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">85.37</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">OpenAI</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Proprietary</span></td>
</tr>
<tr>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">13</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">80.23</span></td>
<td class="tg-p59o"><a href="https://huggingface.co/nvidia/nemotron-4-340b-instruct">Nemotron-4-340b-instruct (Prompt)</a></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">76.67</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">83.38</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">84.1</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">78.05</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">NVIDIA</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">nvidia-open-model-license</span></td>
</tr>
<tr>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">14</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">80.21</span></td>
<td class="tg-p59o"><a href="https://huggingface.co/meetkai/functionary-small-v3.1">Functionary-Small-v3.1 (FC)</a></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">78.64</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">83.45</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">68.36</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">85.37</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">MeetKai</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">MIT</span></td>
</tr>
<tr>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">15</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">79.66</span></td>
<td class="tg-p59o"><a href="https://mistral.ai/news/mistral-large-2407/">mistral-large-2407 (FC Any)</a></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">85.61</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">88.45</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">0.34</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">100</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Mistral</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">AI</span></td>
<td class="tg-qai4"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#212529">Proprietary</span></td>
</tr>
</tbody></table>
*Note: The rankings above are ordered by Overall Acc.*
2. In our evaluation, we assessed the function calling capabilities of various models, including our own models fine-tuned with both the masked and non-masked approaches. Below are the results across several benchmarks, all evaluated in a zero-shot manner; our model, **Hammer-7b**, demonstrates superior performance compared to the other models.
The table below replicates and extends the format of ["Granite-Function Calling Model"](https://arxiv.org/abs/2407.00121), particularly Table 6: Function Calling Academic Benchmarks (a sketch of how such F1 metrics can be computed is given after the tables).
<style type="text/css">
.tg {border-collapse:collapse;border-spacing:0;}
.tg td{border-color:black;border-style:solid;border-width:1px;font-family:Arial, sans-serif;font-size:14px;
overflow:hidden;padding:12px 5px;word-break:normal;}
.tg th{border-color:black;border-style:solid;border-width:1px;font-family:Arial, sans-serif;font-size:14px;
font-weight:normal;overflow:hidden;padding:12px 5px;word-break:normal;}
.tg .tg-baqh{text-align:center;vertical-align:top}
.tg .tg-7geq{background-color:#ffffc7;text-align:center;vertical-align:top}
.tg .tg-k5c1{background-color:#ffffc7;font-weight:bold;text-align:center;vertical-align:top}
.tg .tg-nrix{text-align:center;vertical-align:middle}
.tg .tg-amwm{font-weight:bold;text-align:center;vertical-align:top}
</style>
<table class="tg"><thead>
<tr>
<th class="tg-nrix" rowspan="2"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Model</span></th>
<th class="tg-nrix" rowspan="2"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Size</span></th>
<th class="tg-baqh" colspan="2"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">API-Bank</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">L-1</span></th>
<th class="tg-baqh" colspan="2"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">API-Bank</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">L-2</span></th>
<th class="tg-baqh" colspan="2"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Tool-Alpaca</span></th>
<th class="tg-baqh" colspan="2"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Nexus</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Raven</span></th>
<th class="tg-baqh" colspan="2"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">F1</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Average</span></th>
</tr>
<tr>
<th class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">F1</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Func-Name</span></th>
<th class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">F1</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Args</span></th>
<th class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">F1</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Func-Name</span></th>
<th class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">F1</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Args</span></th>
<th class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">F1</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Func-Name</span></th>
<th class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">F1</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Args</span></th>
<th class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">F1</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Func-Name</span></th>
<th class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">F1</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Args</span></th>
<th class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">F1</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Func-Name</span></th>
<th class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">F1</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Args</span></th>
</tr>
</thead>
<tbody>
<tr>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Functionary-small-v2.4</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">7B</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">78.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">70.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">54.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">45.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">88.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">47.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">82.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">64.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">75.50%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">56.50%</span></td>
</tr>
<tr>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Gorilla-openfunctions-v2</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">7B</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">43.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">41.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">12.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">12.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">69.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">39.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">81.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">65.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">51.20%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">39.30%</span></td>
</tr>
<tr>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Hermes-2-Pro-Mistral</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">7B</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">93.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">77.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">54.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">25.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">80.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">26.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">90.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">63.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">79.30%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">47.80%</span></td>
</tr>
<tr>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Mistral-Instruct-v0.3</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">7B</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">79.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">69.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">69.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">46.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">33.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">33.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">71.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">54.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">63.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">50.50%</span></td>
</tr>
<tr>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">CodeGemma-Instruct</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">7B</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">77.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">57.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">59.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">38.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">59.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">31.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">84.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">68.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">69.80%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">48.50%</span></td>
</tr>
<tr>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Nexusflow-Raven-v2</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">13B</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">51.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">42.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">28.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">22.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">85.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">37.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">92.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">75.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">64.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">44.00%</span></td>
</tr>
<tr>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">C4AI-Command-R-v01</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">35B</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">93.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">76.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">77.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">54.00%</span></td>
<td class="tg-amwm"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000">90.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">42.00%</span></td>
<td class="tg-amwm"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000">93.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">71.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">88.30%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">60.80%</span></td>
</tr>
<tr>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Meta-Llama-3-70B-Instruct</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">70B</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">85.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">67.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">69.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">52.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">78.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">43.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">70.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">52.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">75.50%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">53.50%</span></td>
</tr>
<tr>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">GRANITE-20B-FUNCTIONCALLING</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">20B</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">91.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">71.00%</span></td>
<td class="tg-amwm"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000">83.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">60.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">89.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">44.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">92.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">72.00%</span></td>
<td class="tg-amwm"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000">88.80%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">61.80%</span></td>
</tr>
<tr>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">xlam-7b-fc-r</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">7B</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">90.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">80.70%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">68.90%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">60.70%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">67.30%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">59.00%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">54.10%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">57.50%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">70.10%</span></td>
<td class="tg-baqh"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">64.50%</span></td>
</tr>
<tr>
<td class="tg-7geq"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Hammer-7b</span></td>
<td class="tg-7geq"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">7B</span></td>
<td class="tg-k5c1"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000">93.80%</span></td>
<td class="tg-k5c1"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000">85.90%</span></td>
<td class="tg-7geq"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">79.20%</span></td>
<td class="tg-7geq"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">64.40%</span></td>
<td class="tg-7geq"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">82.30%</span></td>
<td class="tg-k5c1"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000">59.90%</span></td>
<td class="tg-7geq"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">92.50%</span></td>
<td class="tg-k5c1"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000">77.40%</span></td>
<td class="tg-7geq"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">86.90%</span></td>
<td class="tg-k5c1"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000">71.90%</span></td>
</tr>
</tbody></table>
3. Finally, we evaluate our model on the [Seal-Tools](https://arxiv.org/abs/2405.08355) dataset, where it also achieves strong performance.
<style type="text/css">
.tg {border-collapse:collapse;border-spacing:0;}
.tg td{border-color:black;border-style:solid;border-width:1px;font-family:Arial, sans-serif;font-size:14px;
overflow:hidden;padding:12px 5px;word-break:normal;}
.tg th{border-color:black;border-style:solid;border-width:1px;font-family:Arial, sans-serif;font-size:14px;
font-weight:normal;overflow:hidden;padding:12px 5px;word-break:normal;}
.tg .tg-9wq8{border-color:inherit;text-align:center;vertical-align:middle}
.tg .tg-c3ow{border-color:inherit;text-align:center;vertical-align:top}
.tg .tg-7btt{border-color:inherit;font-weight:bold;text-align:center;vertical-align:top}
.tg .tg-mfhl{background-color:#ffffc7;border-color:inherit;text-align:center;vertical-align:top}
.tg .tg-py60{background-color:#ffffc7;border-color:inherit;font-weight:bold;text-align:center;vertical-align:top}
</style>
<table class="tg"><thead>
<tr>
<th class="tg-9wq8" rowspan="2"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Model</span></th>
<th class="tg-9wq8" rowspan="2"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Size</span></th>
<th class="tg-c3ow" colspan="2"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">SealTool(Single-Tool)</span></th>
</tr>
<tr>
<th class="tg-c3ow"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">F1</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Func-Name</span></th>
<th class="tg-c3ow"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">F1</span> <span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Args</span></th>
</tr></thead>
<tbody>
<tr>
<td class="tg-c3ow"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Gorilla-openfunctions-v2</span></td>
<td class="tg-c3ow"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">7B</span></td>
<td class="tg-c3ow"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">93.20%</span></td>
<td class="tg-c3ow"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">91.10%</span></td>
</tr>
<tr>
<td class="tg-c3ow"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">GRANITE-20B-FUNCTIONCALLING</span></td>
<td class="tg-c3ow"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">20B</span></td>
<td class="tg-c3ow"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">94.90%</span></td>
<td class="tg-7btt"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000">92.70%</span></td>
</tr>
<tr>
<td class="tg-c3ow"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">xlam-7b-fc-r</span></td>
<td class="tg-c3ow"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">7B</span></td>
<td class="tg-c3ow"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">79.00%</span></td>
<td class="tg-c3ow"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">76.90%</span></td>
</tr>
<tr>
<td class="tg-mfhl"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">Hammer-7b</span></td>
<td class="tg-mfhl"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">7B</span></td>
<td class="tg-py60"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000">97.40%</span></td>
<td class="tg-mfhl"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000">91.70%</span></td>
</tr>
</tbody></table>
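The F1 Func-Name and F1 Args columns in the academic-benchmark tables above compare predicted tool calls against gold calls. As an illustrative reading of these metrics (not necessarily the exact scoring script used by each benchmark), a set-based F1 could be computed as follows:
~~~python
from typing import Dict, List, Set, Tuple

def f1(pred: Set, gold: Set) -> float:
    """Set-based F1 between predicted and gold items."""
    if not pred and not gold:
        return 1.0
    tp = len(pred & gold)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    return 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0

def score_sample(pred_calls: List[Dict], gold_calls: List[Dict]) -> Tuple[float, float]:
    # F1 over function names, and over (function, argument, value) triples
    pred_names = {c["name"] for c in pred_calls}
    gold_names = {c["name"] for c in gold_calls}
    pred_args = {(c["name"], k, str(v)) for c in pred_calls for k, v in c["arguments"].items()}
    gold_args = {(c["name"], k, str(v)) for c in gold_calls for k, v in c["arguments"].items()}
    return f1(pred_names, gold_names), f1(pred_args, gold_args)
~~~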
## Upcoming Developments
We are actively working on preparing smaller models derived from this architecture, which will be open-sourced soon.
## Example Usage
This is a simple example of how to use our model.
~~~python
import json
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "MadeAgents/Hammer-7b"
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto", torch_dtype="auto", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_name)
# Please use our provided instruction prompt for best performance
TASK_INSTRUCTION = """You are a tool calling assistant. In order to complete the user's request, you need to select one or more appropriate tools from the following tools and fill in the correct values for the tool parameters. Your specific tasks are:
1. Make one or more function/tool calls to meet the request based on the question.
2. If none of the function can be used, point it out and refuse to answer.
3. If the given question lacks the parameters required by the function, also point it out.
"""
FORMAT_INSTRUCTION = """
The output MUST strictly adhere to the following JSON format, and NO other text MUST be included.
The example format is as follows. Please make sure the parameter type is correct. If no function call is needed, please directly output an empty list '[]'
```
[
{"name": "func_name1", "arguments": {"argument1": "value1", "argument2": "value2"}},
... (more tool calls as required)
]
```
"""
# Define the input query and available tools
query = "Where can I find live giveaways for beta access and games? And what's the weather like in New York, US?"
live_giveaways_by_type = {
    "name": "live_giveaways_by_type",
    "description": "Retrieve live giveaways from the GamerPower API based on the specified type.",
    "parameters": {
        "type": "object",
        "properties": {
            "type": {
                "type": "string",
                "description": "The type of giveaways to retrieve (e.g., game, loot, beta).",
                "default": "game"
            }
        },
        "required": ["type"]
    }
}
get_current_weather = {
    "name": "get_current_weather",
    "description": "Get the current weather",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The city and state, e.g. San Francisco, CA"
            }
        },
        "required": ["location"]
    }
}
get_stock_price = {
    "name": "get_stock_price",
    "description": "Retrieves the current stock price for a given ticker symbol. The ticker symbol must be a valid symbol for a publicly traded company on a major US stock exchange like NYSE or NASDAQ. The tool will return the latest trade price in USD. It should be used when the user asks about the current or most recent price of a specific stock. It will not provide any other information about the stock or company.",
    "parameters": {
        "type": "object",
        "properties": {
            "ticker": {
                "type": "string",
                "description": "The stock ticker symbol, e.g. AAPL for Apple Inc."
            }
        },
        "required": ["ticker"]
    }
}
def convert_to_format_tool(tools):
    """Convert OpenAI-style tool definitions into the compact format expected by Hammer,
    marking required parameters and folding default values into the descriptions."""
    if isinstance(tools, dict):
        format_tools = {
            "name": tools["name"],
            "description": tools["description"],
            "parameters": tools["parameters"].get("properties", {}),
        }
        required = tools["parameters"].get("required", [])
        # Flag required parameters explicitly
        for param in required:
            format_tools["parameters"][param]["required"] = True
        # Append default values to the parameter descriptions, since the model
        # relies on descriptions rather than a separate "default" field
        for param in format_tools["parameters"].keys():
            if "default" in format_tools["parameters"][param]:
                default = format_tools["parameters"][param]["default"]
                format_tools["parameters"][param]["description"] += f"default is '{default}'"
        return format_tools
    elif isinstance(tools, list):
        return [convert_to_format_tool(tool) for tool in tools]
    else:
        return tools
# Helper function to build the input prompt for our model
def build_prompt(task_instruction: str, format_instruction: str, tools: list, query: str):
    prompt = f"[BEGIN OF TASK INSTRUCTION]\n{task_instruction}\n[END OF TASK INSTRUCTION]\n\n"
    prompt += f"[BEGIN OF AVAILABLE TOOLS]\n{json.dumps(tools)}\n[END OF AVAILABLE TOOLS]\n\n"
    prompt += f"[BEGIN OF FORMAT INSTRUCTION]\n{format_instruction}\n[END OF FORMAT INSTRUCTION]\n\n"
    prompt += f"[BEGIN OF QUERY]\n{query}\n[END OF QUERY]\n\n"
    return prompt
# Build the input and start the inference
openai_format_tools = [live_giveaways_by_type, get_current_weather, get_stock_price]
format_tools = convert_to_format_tool(openai_format_tools)
content = build_prompt(TASK_INSTRUCTION, FORMAT_INSTRUCTION, format_tools, query)
messages = [
    {"role": "user", "content": content}
]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
# Greedy decoding; generation stops at the tokenizer's EOS token
outputs = model.generate(inputs, max_new_tokens=512, do_sample=False, num_return_sequences=1, eos_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(outputs[0][len(inputs[0]):], skip_special_tokens=True))
~~~
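Since the model is instructed to emit a JSON list of tool calls, the decoded string can be parsed directly with `json.loads`. Continuing from the variables in the example above, a minimal, illustrative post-processing step might look like:
~~~python
raw_output = tokenizer.decode(outputs[0][len(inputs[0]):], skip_special_tokens=True)
try:
    tool_calls = json.loads(raw_output)   # expected: a list of {"name": ..., "arguments": {...}} dicts
except json.JSONDecodeError:
    tool_calls = []                       # fall back gracefully if the model emitted malformed JSON

for call in tool_calls:
    print(call["name"], call["arguments"])
    # In a real application you would dispatch here, e.g. via a name-to-callable registry.
~~~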
---
## References
1. Yan F, Mao H, Ji C C-J, et al. Berkeley Function-Calling Leaderboard.
2. Abdelaziz I, Basu K, Agarwal M, et al. Granite-Function Calling Model: Introducing Function Calling Abilities via Multi-task Learning of Granular Tasks. arXiv preprint arXiv:2407.00121, 2024.
3. Wu M, Zhu T, Han H, et al. Seal-Tools: Self-Instruct Tool Learning Dataset for Agent Tuning and Detailed Benchmark. arXiv preprint arXiv:2405.08355, 2024.
Feel free to reach out for further clarifications or contributions!