---
base_model:
- grimjim/Magnolia-v3-Gemma2-8k-9B
library_name: transformers
tags:
- mergekit
- merge
- roleplay
- story-writing
- adventure
- gemma-2
- rp
- nsfw
language:
- en
- zh
- ja
---
| <img style="float:left;margin-right:0.4em" src="https://qu.ax/gGdYM.webp"> **For RP & story generation,<br/>a good fine-tune of Gemma-2-9B can surprise you with highly creative, authentic expression far beyond its size, which even Gemma-3-12B can't match.<br/>Yet the glitches are just as obvious, and hard to ignore:<br/>it will break a perfect sentence with one word so weird<br/>that it might as well come from another language...<br/><br/>Among the many merges trying to stabilize the bitch,<br/>I enjoy [grimjim/Magnolia-v3-Gemma2-8k-9B](https://huggingface.co/grimjim/Magnolia-v3-Gemma2-8k-9B) the most.<br/>So I picked the rich [recoilme/recoilme-gemma-2-9B-v0.2](https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.2) plus the strong [lemon07r/Gemma-2-Ataraxy-v4d-9B](https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v4d-9B) to tame it with one last merge.<br/>And failed again...<br/><br/>The result is just slightly smarter, more responsive to NSFW directions, with a little rebellious streak.<br/>So keep retrying and editing.<br/>It's 9B, after all.** |
|:---:|
<small>*"It feels a few steps from perfection, 'cause it's Google."*</small>
```yaml
models:
  - model: grimjim/Magnolia-v3-Gemma2-8k-9B
  - model: recoilme/recoilme-gemma-2-9B-v0.2
    parameters:
      density: [0.5, 0.7, 0.6, 0.7, 0.5]
      epsilon: [0.05, 0.07, 0.06, 0.07, 0.05]
      weight: [-0.01150, 0.01793, -0.01034, 0.01855, -0.01876]
  - model: lemon07r/Gemma-2-Ataraxy-v4d-9B
    parameters:
      density: [0.5, 0.3, 0.4, 0.3, 0.5]
      epsilon: [0.05, 0.03, 0.04, 0.03, 0.05]
      weight: [0.01763, -0.01992, 0.01975, -0.01096, 0.01951]
merge_method: della
base_model: grimjim/Magnolia-v3-Gemma2-8k-9B
parameters:
    normalize: false
    lambda: 0.66
tokenizer_source: base
dtype: bfloat16
```
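
This config targets [mergekit](https://github.com/arcee-ai/mergekit)'s DELLA method; to reproduce the merge, save it as `config.yaml` and run `mergekit-yaml config.yaml ./output-dir`. Below is a minimal inference sketch with `transformers`: the model id is a placeholder for this repo, and the sampling settings are illustrative defaults, not tuned recommendations for this merge.

```python
# Minimal inference sketch. Assumptions: `model_id` is a placeholder for this
# repo's id, and the sampling parameters are illustrative, not tuned.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/this-merge"  # placeholder: substitute this repo's id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge's `dtype: bfloat16`
    device_map="auto",
)

# Gemma-2 uses the <start_of_turn>/<end_of_turn> chat format;
# apply_chat_template reads it from the tokenizer config.
messages = [{"role": "user", "content": "Write the opening scene of a heist story."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.9)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```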