[huihui-ai/Qwen2.5-Coder-32B-Instruct-abliterated](https://huggingface.co/huihui-ai/Qwen2.5-Coder-32B-Instruct-abliterated).
The weights are blended in a 9:1 ratio: 90% from QwQ-32B-Preview-abliterated and 10% from Qwen2.5-Coder-32B-Instruct-abliterated.
**Although it's a simple mix, the model is usable, and no gibberish has appeared.**
This is an experiment. I tested the [9:1](https://huggingface.co/huihui-ai/QwQ-32B-Coder-Fusion-9010), [8:2](https://huggingface.co/huihui-ai/QwQ-32B-Coder-Fusion-8020), and [7:3](https://huggingface.co/huihui-ai/QwQ-32B-Coder-Fusion-7030) ratios separately to see how much impact each has on the model.
The effective ratios are 9:1, 8:2, and 7:3; any other ratio (e.g. 6:4 or 5:5) results in mixed or unclear output.
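The blend described above is a plain linear interpolation of the two models' weights. As a rough illustration only (a hypothetical sketch, not the script used to produce these checkpoints), a 9:1 merge of two state dicts can be written as:

```python
# Hypothetical sketch of a 9:1 linear weight blend -- NOT the actual merge
# script behind these checkpoints. State dicts are modeled here as plain
# dicts of float lists; a real merge would operate on torch tensors.

def blend_state_dicts(sd_a, sd_b, ratio_a=0.9):
    """Linearly interpolate two state dicts: ratio_a * a + (1 - ratio_a) * b."""
    assert sd_a.keys() == sd_b.keys(), "models must share an architecture"
    ratio_b = 1.0 - ratio_a
    return {
        name: [ratio_a * wa + ratio_b * wb
               for wa, wb in zip(sd_a[name], sd_b[name])]
        for name in sd_a
    }

# Toy tensors standing in for QwQ-32B-Preview-abliterated (a) and
# Qwen2.5-Coder-32B-Instruct-abliterated (b):
qwq = {"layer.weight": [1.0, 2.0]}
coder = {"layer.weight": [3.0, 4.0]}
merged = blend_state_dicts(qwq, coder, ratio_a=0.9)  # the 9:1 fusion
```

Changing `ratio_a` to 0.8 or 0.7 gives the 8:2 and 7:3 variants.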

## Model Details