AIgotahole committed on
Commit 9dec4ff · verified · 1 Parent(s): b3cf99c

Update README.md

Files changed (1)
  1. README.md +20 -23
README.md CHANGED
@@ -8,29 +8,26 @@ library_name: transformers
  tags:
  - mergekit
  - merge
-
+ - roleplay
+ - story-writing
+ - adventure
+ - gemma-2
+ - rp
+ - nsfw
+ language:
+ - en
+ - zh
+ - ja
+ - fr
+ - ko
+ - de
+ - ru
+ - es
+ - pt
  ---
- # merge
-
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the [Karcher Mean](https://en.wikipedia.org/wiki/Karcher_mean) merge method.
-
- ### Models Merged
-
- The following models were included in the merge:
- * [grimjim/FranFran-Something-12B](https://huggingface.co/grimjim/FranFran-Something-12B)
- * [Nitral-AI/Nera_Noctis-12B](https://huggingface.co/Nitral-AI/Nera_Noctis-12B)
- * [Nohobby/MN-12B-Siskin-v0.2](https://huggingface.co/Nohobby/MN-12B-Siskin-v0.2)
- * [BarBarickoza/Dans-SakuraKaze-Picaro-12b](https://huggingface.co/BarBarickoza/Dans-SakuraKaze-Picaro-12b)
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
+ | <img style="float:left;margin-right:0.4em" src="https://qu.ax/OcEFu.webp"> **For RP & story gen,<br/>fine-tunings of Mistral-Nemo-12B ignite the fire, setting the golden standard between strategy & efficiency, leaving players with confidence over entertainment.<br/>It's blunt and proactive, showcasing the core of datasets in various manners.<br/>Within its power range everything is brilliant;<br/>out of it, absolute mess...<br/><br/>I tried so much of that,<br/>enjoying both the wild [BarBarickoza/Dans-SakuraKaze-Picaro-12b](https://huggingface.co/BarBarickoza/Dans-SakuraKaze-Picaro-12b) and the cool [Nitral-AI/Nera_Noctis-12B](https://huggingface.co/Nitral-AI/Nera_Noctis-12B),<br/>reckoning a classic [Nohobby/MN-12B-Siskin-v0.2](https://huggingface.co/Nohobby/MN-12B-Siskin-v0.2) plus an avant [grimjim/FranFran-Something-12B](https://huggingface.co/grimjim/FranFran-Something-12B) could make a sexy hybrid.<br/>And it smells yummy indeed.<br/><br/>Now the potentiality is deeper with more restrained sanity touching all burning boundaries.<br/>Each retry bleeds.<br/>Don't dose over 12B.** |
+ |:---:|
+ <small>*"This works so well that this doesn't matter at all."*</small>
  ```yaml
  models:
  - model: BarBarickoza/Dans-SakuraKaze-Picaro-12b
@@ -48,4 +45,4 @@ parameters:
  tokenizer_source: base
  dtype: float32
  out_dtype: bfloat16
- ```
+ ```
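For context on the [Karcher Mean](https://en.wikipedia.org/wiki/Karcher_mean) method named in the removed card text above: it is the point that minimizes the sum of squared geodesic distances to the inputs, i.e. an average taken on a curved manifold rather than in flat Euclidean space. The sketch below illustrates that fixed-point iteration for unit vectors on a sphere; it is a conceptual toy only, not mergekit's implementation, and the function name and random data are invented for the example.

```python
import numpy as np

def karcher_mean_sphere(points, iters=100, tol=1e-10):
    """Karcher (Fréchet) mean of unit-norm vectors `points`, shape [n, d]."""
    mu = points[0] / np.linalg.norm(points[0])  # initial estimate
    for _ in range(iters):
        tangents = []
        for p in points:
            cos_t = np.clip(np.dot(mu, p), -1.0, 1.0)
            theta = np.arccos(cos_t)            # geodesic distance from mu to p
            if theta < 1e-12:
                tangents.append(np.zeros_like(mu))
                continue
            v = p - cos_t * mu                  # component of p orthogonal to mu
            tangents.append(theta * v / np.linalg.norm(v))  # log map into tangent space
        mean_tangent = np.mean(tangents, axis=0)
        step = np.linalg.norm(mean_tangent)
        if step < tol:                          # converged
            break
        # Exp map: move along the mean tangent direction while staying on the sphere.
        mu = np.cos(step) * mu + np.sin(step) * mean_tangent / step
        mu /= np.linalg.norm(mu)
    return mu

# Toy usage: average four random unit "weight" directions.
rng = np.random.default_rng(0)
pts = rng.normal(size=(4, 8))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
print(karcher_mean_sphere(pts))
```

The same idea, average in the tangent space and map back, is what the Karcher mean generalizes to other manifolds; a plain arithmetic mean would drift off the manifold instead of being re-centered on it at each step.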