Context Length?

#2 opened by lazyDataScientist

From the config file it looks like it is 24k, but I just want to confirm.
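For reference, here is a minimal sketch of how one might read the configured context window with the `transformers` library; the repo id below is a placeholder, not this model's actual id:

```python
from transformers import AutoConfig

# Placeholder repo id; substitute the actual model repository.
config = AutoConfig.from_pretrained("your-org/your-model")

# max_position_embeddings is the configured maximum context length.
print(config.max_position_embeddings)
```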

It is trained on Llama-3-Soliloquy-8B, which claims:

Trained on over 250 million tokens of roleplaying data, Soliloquy-L3 has a vast knowledge base, rich literary expression, and support for up to 24k context length.

It should therefore have decent long-context support, though I can't give any specific details.
