bartowski posted an update 23 days ago
Old Mixtral model quants may be broken!

Recently slaren over on llama.cpp refactored the model loader - in a way that's super awesome and very powerful - but it broke support for "split tensor" MoE models, which is the layout older Mixtral quants used
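If you want to check whether a GGUF you already have uses the old layout, here's a minimal sketch using the gguf Python package that ships with llama.cpp (pip install gguf). The per-expert tensor naming (blk.N.ffn_gate.E.weight and friends) is my assumption about how the older conversions stored experts, so treat this as a heuristic rather than an official check:

```python
# Heuristic check for the old "split tensor" MoE layout in a GGUF file.
# Assumption: older Mixtral conversions stored one tensor per expert
# (e.g. blk.0.ffn_gate.3.weight), while newer conversions merge experts
# into single blk.N.ffn_*_exps.weight tensors.
import re
import sys

from gguf import GGUFReader

def has_split_moe_tensors(path: str) -> bool:
    reader = GGUFReader(path)
    # One tensor per expert per layer is the old, now-unsupported layout
    split_pattern = re.compile(r"blk\.\d+\.ffn_(gate|down|up)\.\d+\.weight")
    return any(split_pattern.fullmatch(t.name) for t in reader.tensors)

if __name__ == "__main__":
    if has_split_moe_tensors(sys.argv[1]):
        print("Old split-tensor MoE layout found - this quant likely needs remaking")
    else:
        print("No per-expert split tensors found - layout looks current")
```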

You may have seen my upload of one such older Mixtral model, jondurbin/bagel-dpo-8x7b-v0.2, and with the newest changes it seems to run without issue

If you happen to run into issues with any other old Mixtral models, drop a link here and I'll try to remake them with the new changes so that we can continue enjoying them :) A quick way to confirm a quant is actually affected is the load test below
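This is just a sketch assuming the llama-cpp-python bindings (pip install llama-cpp-python) built against a recent llama.cpp; if a quant is hit by the loader change, it should fail right at load time rather than during generation:

```python
# Minimal load test for a GGUF quant. Assumes llama-cpp-python is
# installed and linked against a llama.cpp new enough to include the
# model loader refactor.
import sys

from llama_cpp import Llama

try:
    llm = Llama(model_path=sys.argv[1], n_ctx=512, verbose=False)
    print("Loaded OK - this quant should be fine")
except Exception as exc:
    print("Failed to load - this quant may need remaking:", exc)
```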

Btw, for anyone late to the game - old Mixtral quants should still work as expected in KoboldCpp, and the new ones should work there too, so you have multiple options.
