BOS token set to endoftext

#8 opened by JohanAR

When loading the model (at least the q4_k_m version), llama.cpp says "BOS token = 151643 '<|endoftext|>'" — is that correct?
