Tokenizer's `model_max_length` unusual value
#48
by awni
In the tokenizer_config.json for all the Gemma 3 models, I see this value for the model's max length:
"model_max_length": 1000000000000000019884624838656,
Some downstream packages rely on this value as a sensible default, so it would be great to have it set to the actual max length of the model.
You can always set it to something else yourself. This value is a default placeholder meaning no max length is set. If you are using lm-evaluation-harness, it will see this value and fall back to its own default max length, which is something I have personally encountered.
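For reference, a minimal sketch of how one might override the placeholder when loading the tokenizer with `transformers`; the 128000 context length is an assumption here, so check the model card for your specific Gemma 3 variant:

```python
from transformers import AutoTokenizer

# The shipped config uses a "no limit" placeholder (int(1e30), i.e.
# 1000000000000000019884624838656), so pass an explicit model_max_length.
# 128000 is an assumed value -- use the context length documented for
# the Gemma 3 variant you are actually loading.
tokenizer = AutoTokenizer.from_pretrained(
    "google/gemma-3-4b-it",  # example checkpoint name
    model_max_length=128000,
)

# Alternatively, patch the attribute on an already-loaded tokenizer.
tokenizer.model_max_length = 128000

print(tokenizer.model_max_length)  # 128000
```

Downstream tools that read `tokenizer.model_max_length` will then pick up the overridden value instead of the placeholder.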