---
license: mit
datasets:
- NousResearch/hermes-function-calling-v1
base_model: microsoft/Phi-3.5-mini-instruct
---
# phi3.5-phunction-calling Model Card
## Model Overview
**Model Name:** phi3.5-phunction-calling
**Description:** This model is a fine-tuned version of microsoft/Phi-3.5-mini-instruct, trained on the NousResearch/hermes-function-calling-v1 dataset and specifically designed for function-calling tasks. It has been optimized to understand and emit function calls accurately and efficiently.
## Intended Use
**Primary Use Case:** This model is intended for use in applications where function calling is a critical component, such as automated assistants, code generation, and API interaction.
**Limitations:** The model may still produce incorrect or malformed function calls, particularly for complex or ambiguous requests. Generated calls should be validated before being executed.
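## Usage
The snippet below is a minimal inference example using Unsloth: it loads the fine-tuned checkpoint, applies the Phi-3 chat template to a ShareGPT-style conversation whose system prompt carries Hermes-style function signatures, and generates a tool call. The repository ID passed to `from_pretrained` is a placeholder; substitute this model's actual Hugging Face path.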
```py
from unsloth import FastLanguageModel
from unsloth.chat_templates import get_chat_template

# Load the fine-tuned checkpoint (replace the repo ID below with this model's actual path).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "phi3.5-phunction-calling",  # placeholder repo ID
    max_seq_length = 2048,
    load_in_4bit = True,
)

tokenizer = get_chat_template(
    tokenizer,
    chat_template = "phi-3", # Supports zephyr, chatml, mistral, llama, alpaca, vicuna, vicuna_old, unsloth
    mapping = {"role" : "from", "content" : "value", "user" : "human", "assistant" : "gpt"}, # ShareGPT style
)
FastLanguageModel.for_inference(model) # Enable native 2x faster inference

messages = [
    {"from": "system", "value": "You are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags. You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions.\n<tools>\n[{'type': 'function', 'function': {'name': 'search_recipes', 'description': 'Searches for recipes based on given criteria.', 'parameters': {'type': 'object', 'properties': {'cuisine': {'type': 'string', 'description': 'The type of cuisine to search for.'}, 'dietary_restriction': {'type': 'string', 'description': 'Any dietary restrictions to consider.', 'enum': ['vegetarian', 'vegan', 'gluten-free', 'none']}}, 'required': ['cuisine']}}}, {'type': 'function', 'function': {'name': 'get_recipe_details', 'description': 'Retrieves detailed information about a specific recipe.', 'parameters': {'type': 'object', 'properties': {'recipe_id': {'type': 'string', 'description': 'The unique identifier for the recipe.'}}, 'required': ['recipe_id']}}}, {'type': 'function', 'function': {'name': 'calculate_nutrition', 'description': 'Calculates nutritional information for a given recipe.', 'parameters': {'type': 'object', 'properties': {'recipe_id': {'type': 'string', 'description': 'The unique identifier for the recipe.'}, 'serving_size': {'type': 'integer', 'description': 'The number of servings to calculate nutrition for.', 'default': 1}}, 'required': ['recipe_id']}}}]\n</tools>\nFor each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags with the following schema:\n<tool_call>\n{'arguments': <args-dict>, 'name': <function-name>}\n</tool_call>\n"},
    {"from": "human", "value": "I'm planning a dinner party and I'm looking for some Italian recipes to try. Can you help me find some vegetarian Italian dishes? Once we have a list, I'd like to get more details about the first recipe in the search results. Finally, I want to calculate the nutritional information for that recipe, assuming I'm cooking for 4 people. Can you please perform these tasks for me?"},
]

inputs = tokenizer.apply_chat_template(
    messages,
    tokenize = True,
    add_generation_prompt = True, # Must add for generation
    return_tensors = "pt",
).to("cuda")

outputs = model.generate(input_ids = inputs, max_new_tokens = 256, use_cache = True)
print(tokenizer.batch_decode(outputs))
```
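Because the model can emit malformed or unexpected tool calls (see Limitations above), it is prudent to parse and validate the generated `<tool_call>` blocks before executing anything. The sketch below is one illustrative way to do that, assuming the Hermes-style `<tool_call></tool_call>` output format shown in the system prompt; `extract_tool_calls` is a hypothetical helper, not part of this repository.

```py
import ast
import re

def extract_tool_calls(generated_text: str):
    """Pull Hermes-style <tool_call> payloads out of generated text and
    return them as Python dicts. Malformed payloads are skipped."""
    calls = []
    for payload in re.findall(r"<tool_call>\s*(.*?)\s*</tool_call>", generated_text, re.DOTALL):
        try:
            # Payloads use single-quoted keys, so literal_eval is more forgiving than json.loads.
            call = ast.literal_eval(payload)
        except (ValueError, SyntaxError):
            continue  # skip anything that is not a valid Python literal
        if isinstance(call, dict) and "name" in call and "arguments" in call:
            calls.append(call)
    return calls

# Route validated calls to your own implementations instead of trusting the raw output.
for call in extract_tool_calls(tokenizer.batch_decode(outputs)[0]):
    print(call["name"], call["arguments"])
```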