Update README.md
README.md
@@ -1,4 +1,8 @@
 ---
+base_model:
+- Sao10K/Frostwind-v2.1-m7
+- SanjiWatsuki/Kunoichi-DPO-v2-7B
+- macadeliccc/WestLake-7B-v2-laser-truthy-dpo
 license: cc-by-nc-4.0
 tags:
 - moe
@@ -8,6 +12,7 @@ tags:
 # Experimental 3x7B model
 
 An experimental MoE model customized for all-round roleplay: it understands character cards well and reasons with solid logic.
+Thanks to the original model authors, Sao10K, SanjiWatsuki, and macadeliccc, for creating these models. Pardon me for wanting to hide the recipe. :(
 
 If you want 32k context length support, you could try these versions: