Update README.md #1
opened by 432653dfg

README.md CHANGED
````diff
@@ -10,8 +10,6 @@ tags:
 
 Experimental RP-oriented MoE, the idea was to get a model that would be equal to or better than the Mixtral 8x7B and it's finetunes in RP/ERP tasks.
 
-GGUF Q5_K_S, Q4_K_S and Q3_K_L soon
-
 ### ChaoticSoliloquy-4x8B
 ```
 base_model: jeiku_Chaos_RP_l3_8B
````
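For context: the `base_model:` line visible at the end of the hunk is truncated in this view, but it reads like the opening of a mergekit MoE merge config, which is how 4x8B expert mixes like this are typically assembled. A minimal sketch of that config format follows — the expert models and prompt strings here are purely illustrative assumptions, not taken from this repository:

```yaml
# Hypothetical mergekit-moe config sketch (not the actual config from this PR).
# `base_model` names the model whose weights seed the shared layers;
# each `experts` entry becomes one routed expert in the resulting MoE.
base_model: jeiku_Chaos_RP_l3_8B
gate_mode: hidden          # route tokens using hidden-state similarity to the prompts below
dtype: bfloat16
experts:
  - source_model: jeiku_Chaos_RP_l3_8B   # illustrative placeholder expert
    positive_prompts:
      - "roleplay"
  - source_model: some_other_l3_8B_model # illustrative placeholder expert
    positive_prompts:
      - "storywriting"
```

In this format, a 4x8B model would simply list four `experts` entries; the `positive_prompts` steer the learned router toward each expert at inference time.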