Model Hallucination in Function Calls

#22, opened by princepride

I've run into an issue while testing function calling with vLLM: tool calls are never triggered. Inspecting the response, I noticed that instead of wrapping the call in <tool_call></tool_call> tags, the model emits <function_call></function_call>, and tool_calls comes back empty:

{
    'id': 'chatcmpl-aa7d39bb3d084ff5b4d82f5a87ef5951',
    'object': 'chat.completion',
    'created': 1747139480,
    'model': 'Qwen/Qwen2.5-Coder-7B-Instruct',
    'choices': [
        {
            'index': 0,
            'message': {
                'role': 'assistant',
                'reasoning_content': None,
                'content': '```xml\n<function_call>\n    {"name": "get_weather", "arguments": {"location": "Seoul", "unit": "celsius"}}\n</function_call>\n```',
                'tool_calls': []
            },
            'logprobs': None,
            'finish_reason': 'stop',
            'stop_reason': None
        }
    ],
    'usage': {'prompt_tokens': 203, 'total_tokens': 241, 'completion_tokens': 38, 'prompt_tokens_details': None},
    'prompt_logprobs': None,
    'kv_transfer_params': None
}
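
As far as I can tell, the hermes parser only extracts tool calls from <tool_call>...</tool_call> blocks in the generated text, so a call wrapped in <function_call> tags is never recognized. The snippet below is just a rough illustration of that tag matching; the regex is my approximation, not vLLM's actual parser code:

import re

# Approximation of the tag matching the hermes tool-call parser performs;
# not vLLM's real implementation, just enough to show why tool_calls stays empty.
TOOL_CALL_RE = re.compile(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", re.DOTALL)

observed = (
    '```xml\n<function_call>\n    {"name": "get_weather", "arguments": '
    '{"location": "Seoul", "unit": "celsius"}}\n</function_call>\n```'
)
expected = (
    '<tool_call>\n{"name": "get_weather", "arguments": '
    '{"location": "Seoul", "unit": "celsius"}}\n</tool_call>'
)

print(TOOL_CALL_RE.findall(observed))  # [] -> the parser sees no tool call at all
print(TOOL_CALL_RE.findall(expected))  # ['{"name": "get_weather", ...}'] -> would be parsed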

Here's the command I used:

vllm serve Qwen/Qwen2.5-Coder-7B-Instruct \
    --quantization gptq \
    --download-dir /models \
    --enable-auto-tool-choice \
    --tool-call-parser hermes

And here's my request code:

import requests
import json

url = "http://localhost:8000/v1/chat/completions"
headers = {'Content-Type': 'application/json'}
data = {
    "model": "Qwen/Qwen2.5-Coder-7B-Instruct",
    "messages": [
        {
            "role": "system",
            "content": "you are helpful ai"
        },
        {
            "role": "user",
            "content": "what is the weather in seoul?"
        }
    ],
    "stream": False,
    "tool_choice": "auto",
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather in a given location",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "City and state, e.g., 'San Francisco, CA'"
                        },
                        "unit": {
                            "type": "string",
                            "enum": [
                                "celsius",
                                "fahrenheit"
                            ]
                        }
                    },
                    "required": [
                        "location",
                        "unit"
                    ]
                }
            }
        }
    ]
}

response = None  # so the except block can safely check for a (possibly missing) response
try:
    response = requests.post(url, headers=headers, data=json.dumps(data))
    response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)
    print("Request successful!")
    print("Response content:")
    print(response.json())
except requests.exceptions.RequestException as e:
    print(f"Request failed: {e}")
    if response is not None:
        print(f"Response status code: {response.status_code}")
        print(f"Response content: {response.text}")
