Real model context length?

#3
by gregporter585 - opened

In your model card, you advertise a context length of 131,072 tokens. However, the model is limited to 32,768 tokens, as defined by `max_position_embeddings` in the config.json.

I noticed `rope_scaling` was null. Is RoPE scaling supported in order to actually use the full 131,072 tokens? If so, what factor do you suggest using?
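For reference, here is a sketch of what a YaRN-style `rope_scaling` entry in config.json might look like. The factor 4.0 simply follows from 131,072 / 32,768; whether `"yarn"` (or another `rope_type`) is actually supported depends on this model's architecture, which is why I'm asking:

```json
{
  "max_position_embeddings": 131072,
  "rope_scaling": {
    "rope_type": "yarn",
    "factor": 4.0,
    "original_max_position_embeddings": 32768
  }
}
```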