Generation does not stop at tokenizer.eos_token_id

#3
by quanyingxiu - opened

The 9B model stops correctly at eos_token_id=151645, but the 83B model does not stop; it keeps generating until it hits max_new_tokens.

I tried both eos_token_id=151645 and eos_token_id=tokenizer.eos_token_id, but the issue persists.
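For reference, this is roughly my setup, as a minimal sketch. The model id and the prompt are placeholders, and I'm assuming a standard `transformers` chat-style `generate()` call:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# "org/model-83b" is a placeholder for the 83B checkpoint in question.
model_id = "org/model-83b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Hello!"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Both variants were tried; generation still runs to max_new_tokens.
outputs = model.generate(
    inputs,
    max_new_tokens=512,
    eos_token_id=151645,  # also tried eos_token_id=tokenizer.eos_token_id
)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=False))
```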

How can I fix this?
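For anyone debugging the same thing, here is the kind of check I'd run to confirm which eos ids are actually configured and whether token 151645 ever appears in the output (reusing the objects from the sketch above):

```python
# Compare the eos ids carried by the tokenizer and by the model's
# generation config, then check the raw output ids for 151645.
print("tokenizer.eos_token_id:", tokenizer.eos_token_id)
print("generation_config.eos_token_id:", model.generation_config.eos_token_id)
print("151645 in output ids:", 151645 in outputs[0].tolist())
```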
