Hah, this is pretty good... but it's using that DeepSeek long-winded reasoning style.

Going to try a modified dataset with some sort of "concise" reasoning.
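One way to build such a "concise" variant is to trim each DeepSeek-style `<think>...</think>` trace down to its opening sentences before fine-tuning. This is a minimal sketch, not the actual preprocessing used here: the `<think>` tag format, the `text` field name, and the `make_concise` helper are all assumptions for illustration.

```python
import re

def make_concise(example, max_sentences=2):
    """Shorten a DeepSeek-style <think>...</think> reasoning trace to its
    first few sentences, leaving the final answer untouched.

    NOTE: the tag format and the 'text' field are hypothetical, chosen
    only to illustrate the idea of a 'concise reasoning' dataset pass.
    """
    match = re.search(r"<think>(.*?)</think>", example["text"], re.DOTALL)
    if not match:
        return example  # no reasoning block, nothing to trim
    reasoning = match.group(1).strip()
    # Naive sentence split on terminal punctuation followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", reasoning)
    concise = " ".join(sentences[:max_sentences])
    example["text"] = (
        example["text"][: match.start()]
        + "<think>\n" + concise + "\n</think>"
        + example["text"][match.end():]
    )
    return example

sample = {"text": "<think>First step. Second step. Third step.</think>The answer is 42."}
print(make_concise(sample)["text"])
```

A function with this shape drops straight into a `datasets.Dataset.map(make_concise)` call, so the trimmed traces can be regenerated from the original dataset without hand-editing.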

Downloads last month: 86
Model size: 12.2B params (Safetensors)
Tensor types: BF16 · F16

Model tree for SuperbEmphasis/Mistral-Nemo-R1-ERP-Reasoning

Dataset used to train SuperbEmphasis/Mistral-Nemo-R1-ERP-Reasoning