broken #1
opened by mradermacher
Hi, these GGUFs are broken and crash llama.cpp. The same happened to me when I made my GGUFs. The reason is a broken tokenizer in the original model - possibly it uses a BPE vocabulary but ships a wrong tokenizer.model. If that is the case, deleting the broken tokenizer.model might fix it.
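A minimal sketch of that workaround, assuming the original model directory contains a BPE tokenizer.json alongside the stale tokenizer.model; the directory path and the suggested re-conversion command (llama.cpp's convert_hf_to_gguf.py) are illustrative, not taken from this thread:

```python
# Sketch: move a stale tokenizer.model aside so GGUF conversion falls back
# to the BPE vocabulary in tokenizer.json. Paths are hypothetical examples.
from pathlib import Path

model_dir = Path("path/to/original-model")  # hypothetical local checkout

spm = model_dir / "tokenizer.model"   # SentencePiece file that may be wrong
bpe = model_dir / "tokenizer.json"    # BPE vocabulary actually used

if bpe.exists() and spm.exists():
    # tokenizer.json suggests a BPE vocab; the tokenizer.model is likely
    # inconsistent with it and confuses the converter.
    spm.rename(model_dir / "tokenizer.model.bak")  # back up rather than delete
    print("Moved tokenizer.model aside; re-run conversion, e.g.:")
    print(f"  python convert_hf_to_gguf.py {model_dir} --outtype f16")
else:
    print("Nothing to do: expected both tokenizer.json and tokenizer.model.")
```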
Thank you for pointing this out.
For some reason the full (non-GGUF) model was still eval-ing, but with very low scores, around 25% of what it should be. Strange.
nisten changed discussion status to closed