---
base_model:
- Phr00t/Phr00tyMix-v2-32B
- huihui-ai/DeepSeek-R1-Distill-Qwen-32B-abliterated
- trashpanda-org/QwQ-32B-Snowdrop-v0
- allura-org/Qwen2.5-32b-RP-Ink
- Delta-Vector/Archaeo-32B-KTO
- arcee-ai/Virtuoso-Medium-v2
library_name: transformers
tags:
- mergekit
- merge
---
This model has been replaced by [Phr00tyMix v4](https://huggingface.co/Phr00t/Phr00tyMix-v4-32B).

**WARNING:** This model is slightly incoherent; v4 should be used instead.
# Phr00tyMix-v3-32B

After many, many failed attempts, I finally got the Phr00tyMix v3 I was looking for! I find it more creative and spicy than Phr00tyMix v2, while perhaps also improving smarts and obedience (edit: a mixed bag -- use v4 instead). If you ask it to be uncensored, it will be.

The goal: smart, obedient, creative, spicy, uncensored and coherent. My initial testing shows this tops all of my previous mixes.

I recommend a high temperature (1.5) and a high MinP (0.1), but play around as you wish.
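
If you want to wire those settings up in code, here is a minimal sketch using the Hugging Face `transformers` generation API. The `min_p` parameter needs a reasonably recent `transformers` release, and the repo id below is assumed from this card's title:

```python
# Minimal sketch: the recommended sampler settings via transformers.
# Assumes a recent transformers release (for min_p support) and that the
# model lives at the repo id below (assumed from the card title).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Phr00t/Phr00tyMix-v3-32B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype="auto"
)

messages = [{"role": "user", "content": "Tell me a story with no filter."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(
    input_ids,
    max_new_tokens=512,
    do_sample=True,
    temperature=1.5,  # high temperature, as recommended above
    min_p=0.1,        # high MinP keeps the high temperature coherent
)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The same two knobs (temperature and min-p) exist in most local runners (llama.cpp, KoboldCpp, SillyTavern's samplers), so the values should carry over directly.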
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [Phr00t/Phr00tyMix-v2-32B](https://huggingface.co/Phr00t/Phr00tyMix-v2-32B) as the base.
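
Loosely summarized (see the paper for the actual derivation), Model Stock averages the merged checkpoints and then pulls that average back toward the base model, with an interpolation ratio computed from the angle between the fine-tuned weights:

$$ w_{\text{merged}} = t\,\bar{w} + (1 - t)\,w_0, \qquad t = \frac{k\cos\theta}{1 + (k - 1)\cos\theta} $$

where $w_0$ is the base model's weights, $\bar{w}$ is the average of the $k$ merged models' weights, and $\theta$ is the (layer-wise) angle between them.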
### Models Merged
The following models were included in the merge:
* [huihui-ai/DeepSeek-R1-Distill-Qwen-32B-abliterated](https://huggingface.co/huihui-ai/DeepSeek-R1-Distill-Qwen-32B-abliterated)
* [trashpanda-org/QwQ-32B-Snowdrop-v0](https://huggingface.co/trashpanda-org/QwQ-32B-Snowdrop-v0)
* [allura-org/Qwen2.5-32b-RP-Ink](https://huggingface.co/allura-org/Qwen2.5-32b-RP-Ink)
* [Delta-Vector/Archaeo-32B-KTO](https://huggingface.co/Delta-Vector/Archaeo-32B-KTO)
* [arcee-ai/Virtuoso-Medium-v2](https://huggingface.co/arcee-ai/Virtuoso-Medium-v2)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
merge_method: model_stock
base_model: Phr00t/Phr00tyMix-v2-32B
dtype: bfloat16
models:
- model: huihui-ai/DeepSeek-R1-Distill-Qwen-32B-abliterated
- model: Delta-Vector/Archaeo-32B-KTO
- model: trashpanda-org/QwQ-32B-Snowdrop-v0
- model: allura-org/Qwen2.5-32b-RP-Ink
- model: arcee-ai/Virtuoso-Medium-v2
tokenizer:
  source: "Delta-Vector/Archaeo-32B-KTO"
```
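
To reproduce the merge, something along these lines should work, assuming mergekit's standard `mergekit-yaml` CLI (`pip install mergekit`) and the YAML above saved as `config.yaml`:

```python
# Sketch: re-run this merge via mergekit's CLI. Assumes mergekit is
# installed and the configuration above is saved as config.yaml; the
# output directory name is illustrative.
import subprocess

subprocess.run(
    [
        "mergekit-yaml",
        "config.yaml",           # the merge configuration shown above
        "./Phr00tyMix-v3-32B",   # output directory for the merged weights
        "--cuda",                # optional: do the merge math on GPU
    ],
    check=True,
)
```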