Lack of `max_position_embeddings` in config.json

#17
by Zihao-Li - opened
```python
from transformers import AutoConfig

model = "google/gemma-3-12b-it"
config = AutoConfig.from_pretrained(model)
max_len = getattr(config, 'max_position_embeddings', None)
print(max_len)
# None
```

Is it possible to add it?
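
In the meantime, a small fallback helper can cover configs that don't expose the field at the top level. This is an illustrative sketch, not the official API: the `text_config` nesting and the `get_max_len` helper are assumptions based on how multimodal configs often keep their text settings in a nested sub-config, shown here with a stand-in object instead of a downloaded config.

```python
from types import SimpleNamespace

def get_max_len(config, default=None):
    """Look for max_position_embeddings on the top-level config,
    then on a nested text_config (some multimodal configs keep
    text settings there), else fall back to `default`.
    Hypothetical helper for illustration only."""
    value = getattr(config, "max_position_embeddings", None)
    if value is None:
        text_config = getattr(config, "text_config", None)
        value = getattr(text_config, "max_position_embeddings", None)
    return value if value is not None else default

# Stand-in for a config whose text settings are nested.
config = SimpleNamespace(text_config=SimpleNamespace(max_position_embeddings=131072))
print(get_max_len(config, default=8192))  # → 131072

# Stand-in for a config missing the field entirely.
print(get_max_len(SimpleNamespace(), default=8192))  # → 8192
```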

Google org

Hi @Zihao-Li ,

Thanks for bringing this to our attention. `max_position_embeddings` for the google/gemma-3-12b-it model is 131072. We will update the config as soon as possible. For more information, please refer to this link.

Thank you.
