Broken Tutu works well, but has a limited context window
#1235
by PatrickMcHargue - opened
I'm using Broken Tutu, and some of the other Tutu builds (Q8), to write prose. It does an excellent job as far as I'm concerned. However, I regularly bust the 32k context window cap. I've also used some of the other fine-tunes you've published, but the Tutu builds seem to work better for me (at least until I figure out how to fine-tune against a favorite author myself).
Would it be possible to publish a version of this with a larger context window? 64k would work, but I would prefer to have 128k.
Is that possible? If it is, I hope you'll make this change.
We are not really publishing models; we just quant them into a handier format for most people (GGUF). You might want to ask this on the original model, linked from the model page.
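For what it's worth, the 32k limit comes from the base model's configuration, not from the GGUF conversion itself, and most GGUF loaders let you request a larger window and apply RoPE scaling at load time (quality past the trained length isn't guaranteed). A minimal sketch using llama-cpp-python, assuming it is installed; the file name, scaling factor, and prompt are placeholders, not anything from this repo:

```python
from llama_cpp import Llama

# Hypothetical local GGUF file; substitute the actual Q8 file you downloaded.
llm = Llama(
    model_path="Broken-Tutu-Q8_0.gguf",
    n_ctx=65536,          # ask for a 64k window at load time
    rope_freq_scale=0.5,  # linear RoPE scaling: 32k trained window * (1/0.5) = 64k
)

out = llm("Continue the story: ...", max_tokens=256)
print(out["choices"][0]["text"])
```

Whether the output stays coherent that far past the trained context depends on the base model, so treat this as an experiment rather than a fix.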