question about context size

#2
by Clausss - opened

Hello, I have a question: is max_position_embeddings the context size of the model?

Hi, @Clausss!

Yes, for models where config.json has "model_type": "llama", max_position_embeddings indicates the maximum context size, meaning the model was trained on sequences of up to 2048 tokens. If generation goes past this limit, the model will probably start producing nonsensical output.
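If it helps, here is a minimal sketch of how you could check the value yourself with the transformers library (the model id below is just an illustrative example, not necessarily this repo):

```python
from transformers import AutoConfig

# Load only the config.json of a model repo (example id, swap in the model you care about)
config = AutoConfig.from_pretrained("huggyllama/llama-7b")

print(config.model_type)               # e.g. "llama"
print(config.max_position_embeddings)  # maximum context size in tokens, e.g. 2048
```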

Thank you for the fast response and the clear explanation 🤗

Clausss changed discussion status to closed
