output issue

#14
by mobo68 - opened

Hi,

I deployed the model using the vLLM 0.9.1 Docker image with the following parameters:

--tokenizer-mode mistral --config-format mistral --load-format mistral --tool-call-parser mistral --enable-auto-tool-choice
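For context, here is a minimal sketch of how these flags would be passed to the vLLM OpenAI-compatible server container; the image tag follows the official `vllm/vllm-openai` naming for 0.9.1, but the model ID, GPU options, and port are placeholders I've assumed, not details from the original post:

```shell
# Hedged deployment sketch -- <model-repo> is a placeholder, not the actual model ID.
docker run --gpus all -p 8000:8000 \
  vllm/vllm-openai:v0.9.1 \
  --model <model-repo> \
  --tokenizer-mode mistral \
  --config-format mistral \
  --load-format mistral \
  --tool-call-parser mistral \
  --enable-auto-tool-choice
```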

However, the output contains garbled text, such as unexpected Chinese characters and gibberish words.

Am I missing something?
Thanks!
