---
base_model:
- Sao10K/14B-Qwen2.5-Kunou-v1
library_name: transformers
tags:
- mergekit
- merge
- roleplay
- story-writing
- adventure
- qwen2.5
- rp
- nsfw
language:
- en
- zh
- ja
- fr
- ko
- de
- ru
- es
- pt
---
| <img style="float:left;margin-right:0.4em" src="https://qu.ax/rjHgn.webp"> **For RP & story gen,<br/>no one expected Qwen2.5-14B to be this elegant. Fine-tunes fought to be the cherry on top, till Qwen3 melted the cake itself.<br/>The Oscar winner was content before being thrown onto the real battlefield.<br/>Word is her wounds are dramatic,<br/>altogether too charming to be breathing.<br/><br/>I suppose players got lost in comparisons,<br/>letting [Sao10K/14B-Qwen2.5-Kunou-v1](https://huggingface.co/Sao10K/14B-Qwen2.5-Kunou-v1), the innocent villain, hide behind [sometimesanotion/Qwenvergence-14B-v13-Prose-DS](https://huggingface.co/sometimesanotion/Qwenvergence-14B-v13-Prose-DS), the cynosure of all eyes,<br/>not to mention a drifting [deepcogito/cogito-v1-preview-qwen-14B](https://huggingface.co/deepcogito/cogito-v1-preview-qwen-14B) that actually kills via misfire.<br/><br/>A combination of the three could live a heroic epic life in fantasy worlds.<br/>In reality she is caught by a...<br/>ricocheting bullet to the cheek.** |
|:---:|
<small>*"If you want to go deep, don't take the blue pill."*</small>
```yaml
models:
  - model: Sao10K/14B-Qwen2.5-Kunou-v1
  - model: sometimesanotion/Qwenvergence-14B-v13-Prose-DS
    parameters:
      density: [0.16, 0.26, 0.36, 0.46, 0.56, 0.46, 0.36, 0.26, 0.16]
      weight: [0.166, 0.496, 0.496, 0.166, 0.166, 0.496, 0.496, 0.166]
  - model: deepcogito/cogito-v1-preview-qwen-14B
    parameters:
      density: [0.56, 0.46, 0.36, 0.26, 0.16, 0.26, 0.36, 0.46, 0.56]
      weight: [0.496, 0.166, 0.166, 0.496, 0.496, 0.166, 0.166, 0.496]
merge_method: breadcrumbs
base_model: Sao10K/14B-Qwen2.5-Kunou-v1
parameters:
  gamma: 0.06
  lambda: 0.96
tokenizer_source: base
dtype: bfloat16
```
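For the curious: the `breadcrumbs` method sparsifies each donor model's delta from the base by trimming both ends of the magnitude distribution, dropping the largest-magnitude entries as outliers (roughly the `gamma` fraction) and pruning the smallest, keeping only about `density` overall. A rough NumPy sketch of that idea, using the `gamma: 0.06` from the config above (the function name and exact slicing are illustrative assumptions, not mergekit's actual implementation):

```python
import numpy as np

def breadcrumbs_mask(delta: np.ndarray, density: float, gamma: float) -> np.ndarray:
    """Boolean mask over a delta (task-vector) tensor: discard the top
    `gamma` fraction of largest-magnitude entries as outliers, then keep
    the `density` fraction just below them; everything smaller is pruned.
    Sketch only -- mergekit's internals differ in detail."""
    n = delta.size
    order = np.argsort(np.abs(delta).ravel())  # indices, ascending magnitude
    n_top = int(gamma * n)                     # outliers dropped from the top
    n_keep = int(density * n)                  # entries retained
    keep = order[n - n_top - n_keep : n - n_top]
    mask = np.zeros(n, dtype=bool)
    mask[keep] = True
    return mask.reshape(delta.shape)

# Toy deltas with magnitudes 1..100, mid-layer density 0.5, gamma 0.06:
delta = np.arange(1.0, 101.0)
mask = breadcrumbs_mask(delta, density=0.5, gamma=0.06)
# the 6 largest and 44 smallest entries are masked out; 50 survive
```

The per-layer lists under `density` and `weight` are gradients that mergekit interpolates across the model's layers, so sparsity and blend strength vary by depth. The config itself is consumed by mergekit's CLI (e.g. `mergekit-yaml config.yml output-dir/`).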