
Apriel R1P V.2

Day 2 RP finetune of Apriel 15B, with several iterative improvements over the first version. In particular, coherence at reasonable temperatures (~0.7) should be much higher.

I also fully converted the model to the Phi 3 chat format; the slight tradeoff is that the <|end|> tag does not always tokenize identically in a few niche scenarios.

Further attempts were made to fix the base model's formatting issues with asterisks.

NOTE: THIS IS THE THINKING VERSION

Upon further testing, I discovered that while merging back onto the instruct model improved thinking mode, it came at the cost of degraded non-thinking outputs.

Use the non-thinking version instead if you want a standard model.

Thinking Mode

  • To enable thinking mode, place /think in the system prompt and prefill <|think|>\n at the start of the assistant response.

  • Phi-esque thinking tags, <|think|> and <|/think|>, have been added to the model.

  • Remember to reconfigure SillyTavern to parse the new thinking tags.
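Putting the steps above together, a prompt with thinking enabled might be assembled like this. This is a minimal sketch assuming the standard Phi 3 turn markers (<|system|>, <|user|>, <|assistant|>, <|end|>) plus the custom thinking tags described above; it is not the model's verified Jinja template, and the helper name is hypothetical.

```python
def build_prompt(system: str, user: str, think: bool = True) -> str:
    """Sketch of a Phi 3 style prompt with optional thinking mode."""
    # /think in the system prompt enables thinking mode.
    sys_content = system + ("\n/think" if think else "")
    prompt = (
        f"<|system|>\n{sys_content}<|end|>\n"
        f"<|user|>\n{user}<|end|>\n"
        f"<|assistant|>\n"
    )
    # Prefill <|think|>\n so the model opens its reasoning block.
    if think:
        prompt += "<|think|>\n"
    return prompt

print(build_prompt("You are a narrator.", "Continue the scene."))
```

The model is then expected to close its reasoning with <|/think|> before writing the visible reply.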

Settings

The chat template has been converted to a Phi 3 template as the model seemed to respond best to this format.

This model prefers having character cards placed in user messages rather than the system prompt.
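A sketch of how that preference might look as a chat message list: the character card is prepended to the first user turn, and the system prompt stays minimal. The card text and names here are purely illustrative.

```python
# Illustrative character card; contents are hypothetical.
character_card = (
    "Name: Aria\n"
    "Personality: curious, dry wit, speaks in short sentences."
)

messages = [
    # Keep the system prompt minimal; /think enables thinking mode.
    {"role": "system", "content": "/think"},
    # Put the card in the user message, not the system prompt.
    {"role": "user", "content": character_card + "\n\nHello, Aria."},
]

print(messages[1]["content"])
```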

Special Thanks:

Undi95 for portions of their dataset and inspiration.

PJMixers-Dev for their dataset curation and creation efforts.
