Error on both ExLlamav2 and ExLlamav2_HF in ooba/tgw
#1
by
2themaxx
- opened
I get an error loading this model in the ooba/tgw 1-click install (commit ace8afb825c80925ed21ab26dbf66b538ab06285). Previous exl2 quants such as "turboderp/gemma-3-27b-it-exl2" still load fine, and turboderp/Qwen3-32b-ExL3 also loads fine on the same ooba commit.
...
line 483, in check_keys
raise ValueError(f" ## Could not find {prefix}.* in model")
ValueError: ## Could not find model.layers.0.mlp.down_proj.* in model
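The loader is complaining that no tensor key in the checkpoint starts with `model.layers.0.mlp.down_proj`. A quick way to check whether the quant's shard actually contains those keys is to read the safetensors header directly (the file begins with an 8-byte little-endian header length followed by a JSON header of tensor names). A minimal stdlib-only sketch; the file path and the `missing_prefixes` helper name are placeholders, not part of ooba or exllamav2:

```python
import json
import struct

def checkpoint_keys(path):
    """Return tensor names from a .safetensors file without loading weights."""
    with open(path, "rb") as f:
        # First 8 bytes: little-endian uint64 giving the JSON header size.
        (header_len,) = struct.unpack("<Q", f.read(8))
        header = json.loads(f.read(header_len))
    # "__metadata__" is an optional non-tensor entry in the header.
    return [k for k in header if k != "__metadata__"]

def missing_prefixes(path, prefixes):
    """Return the prefixes that match no tensor key in the checkpoint."""
    keys = checkpoint_keys(path)
    return [p for p in prefixes if not any(k.startswith(p) for k in keys)]
```

Running `missing_prefixes("model.safetensors", ["model.layers.0.mlp.down_proj"])` against the downloaded shard would show whether the key is genuinely absent (a broken or incomplete download / quant) or present under a different name (a loader-version mismatch).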