The chat template is still broken
See https://www.reddit.com/r/LocalLLaMA/comments/1klltt4/the_qwen3_chat_template_is_still_bugged/
Under the OpenAI spec, assistant history messages that make tool calls may omit the "content" field, but the template crashes when such a history is provided. This error has made it all the way to production and can be reproduced on the OpenRouter playground with the Qwen3 models.
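For reference, this is a minimal sketch of the kind of history that triggers the crash. The tool name and arguments are made up for illustration; the message shape follows the OpenAI chat-completions spec, where an assistant turn that issues tool calls carries "tool_calls" but no "content" key.

```python
# Hypothetical example: an OpenAI-spec-valid history whose assistant turn
# has "tool_calls" but no "content" key -- the shape reported to crash
# the Qwen3 chat template.
messages = [
    {"role": "user", "content": "What is the weather in Paris?"},
    {
        "role": "assistant",
        # No "content" key here -- allowed by the OpenAI spec.
        "tool_calls": [
            {
                "id": "call_1",
                "type": "function",
                "function": {
                    "name": "get_weather",  # made-up tool name
                    "arguments": '{"city": "Paris"}',
                },
            }
        ],
    },
    {"role": "tool", "tool_call_id": "call_1", "content": "18 C, sunny"},
]

# Rendering such a history, e.g. via
#   tokenizer.apply_chat_template(messages, tokenize=False)
# reportedly errors out instead of formatting the tool-call turn.
```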
We're converting the models at the moment and will reupload.