What's the max context on this?

#9
by ThePabli - opened

I'm assuming 32k since I can't find any data, but this is extremely efficient: the lowest VRAM usage of the hundreds of models I've tested.

Z.ai & THUKEG org

Check our GitHub: the context length has been extended from 32K to 128K.
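
For anyone landing here later, one way to confirm the configured context window is to read it from the model config. A minimal sketch, assuming the `transformers` library and a placeholder repo ID (substitute the actual model ID from this repo):

```python
from transformers import AutoConfig

# Placeholder repo ID -- replace with the real model ID.
config = AutoConfig.from_pretrained("zai-org/model-name", trust_remote_code=True)

# Which attribute holds the maximum context length depends on the model's
# config class; check the common names.
for attr in ("max_position_embeddings", "seq_length", "max_sequence_length"):
    if hasattr(config, attr):
        print(attr, "=", getattr(config, attr))
```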

zRzRzRzRzRzRzR changed discussion status to closed