Update chat_template and tool_parser

#27 opened by rshcao
  1. Support arbitrary XML-format tool key definitions;
  2. Support more robust serialization of array/dict-type tool call arguments;
  3. Revise the tool_parser to match the changes in the chat_template, and remove unsafe eval operations (see the sketch after this list).
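
For illustration, here is a minimal sketch of the kind of eval-free argument conversion the third point refers to. The function name and type labels are assumptions for this example, not the repo's actual code:

```python
import json
from typing import Any

def convert_param_value(value: str, param_type: str) -> Any:
    """Convert a raw string parameter extracted from an XML-style tool-call
    block into a typed Python value without resorting to eval()."""
    if param_type in ("integer", "int"):
        try:
            return int(value)
        except ValueError:
            return value  # fall back to the raw string on failure
    if param_type in ("number", "float"):
        try:
            return float(value)
        except ValueError:
            return value
    if param_type in ("boolean", "bool"):
        return value.strip().lower() == "true"
    if param_type in ("array", "object", "dict"):
        # json.loads handles structured arguments safely, replacing unsafe eval()
        try:
            return json.loads(value)
        except json.JSONDecodeError:
            return value
    return value
```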
cyente changed pull request status to merged

This parser is nearly identical to the one in vllm: https://github.com/vllm-project/vllm/blob/b7adf94c4a6c7290dd8765819da68a801008f5a1/vllm/entrypoints/openai/tool_parsers/qwen3coder_tool_parser.py

Is the one in this repo the recommended one to use?

If so, is there a way to load this parser when launching vllm?
--tool-parser-plugin "path/to/parser.py"

I'm just using --tool-call-parser qwen3_coder currently

We have submitted a pull request to the vLLM team and are actively urging them to use exactly the same parser as this Hugging Face repo (the parser in this HF repo is a newer version).

Until vLLM merges it, you can load this parser manually when launching vLLM with:

--enable-auto-tool-choice --tool-call-parser qwen3_coder --tool-parser-plugin "path/to/this/hf_repo/parser.py" --chat-template "path/to/this/hf_repo/chat_template.jinja"
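
Once the server is up, you can check that tool calls come back as structured objects rather than raw text, e.g. with the OpenAI-compatible client. The model name, port, and tool definition below are placeholders for illustration, not values from this repo:

```python
from openai import OpenAI

# Assumes the vLLM server launched above is listening on localhost:8000
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool for illustration
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="qwen3-coder",  # placeholder: use the model name you served
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# With --enable-auto-tool-choice and the parser loaded, tool calls are
# returned in the structured tool_calls field instead of as plain text.
print(response.choices[0].message.tool_calls)
```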

Thank you!
