DraftReasoner-2x7B-MoE-v0.1

An experimental two-expert mixture-of-experts (MoE) merge using mlabonne/Marcoro14-7B-slerp as the base model.

Notes

Please evaluate this model before using it in any application pipeline. The math expert is activated by prompt keywords such as 'math', 'reason', 'solve', and 'count'.
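The keyword list above biases the MoE gate toward the math expert. As a rough, self-contained illustration of that idea (the real gate is a learned router inside the merged model, not a string match; only the keyword list comes from this card), a toy keyword-based routing sketch might look like:

```python
# Toy sketch of keyword-conditioned expert routing. The actual model uses a
# learned gating network; the keyword list below is the one stated in this card.
MATH_KEYWORDS = {"math", "reason", "solve", "count"}

def route_expert(prompt: str) -> str:
    """Return which expert a prompt would nudge the gate toward."""
    tokens = {t.strip(".,?!:").lower() for t in prompt.split()}
    return "math" if tokens & MATH_KEYWORDS else "general"

print(route_expert("Solve: what is 17 * 23?"))  # math
print(route_expert("Write a short story."))     # general
```

In the merged model itself, these keywords correspond to the positive prompts used when constructing the gate, so prompts containing them are routed with higher weight to the math expert's feed-forward layers.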

Safetensors
Model size: 12.9B params
Tensor type: BF16