Lewdiculous committed
Commit 9e88a38
1 Parent(s): ce98d83

Update README.md

Files changed (1)
  1. README.md +31 -9
README.md CHANGED
@@ -12,15 +12,18 @@ tags:
 - roleplay
 - imatrix
 - mistral
+- merge
 inference: false
 ---
 
-This repository hosts GGUF-Imatrix quantizations for [ChaoticNeutrals/Eris_Floramix_DPO_7B](https://huggingface.co/ChaoticNeutrals/Eris_Floramix_DPO_7B).
+This repository hosts GGUF-Imatrix quantizations for [Test157t/Eris-Daturamix-7b](https://huggingface.co/Test157t/Eris-Daturamix-7b).
+
+To be uploaded:
 
 ```python
 quantization_options = [
-    "Q3_K_M", "Q4_K_M", "Q5_K_M", "Q6_K",
-    "Q8_0", "IQ4_XS", "IQ3_XXS"
+    "Q4_K_M", "IQ4_XS", "Q5_K_M", "Q6_K",
+    "Q8_0", "IQ3_M", "IQ3_S", "IQ3_XXS"
 ]
 ```
 
@@ -30,14 +33,33 @@ For imatrix data generation, kalomaze's `groups_merged.txt` with added roleplay
 
 The goal is to measure the (hopefully positive) impact of this data for consistent formatting in roleplay chatting scenarios.
 
-**Image:**
-
-![image/png](https://cdn-uploads.huggingface.co/production/uploads/65d4cf2693a0a3744a27536c/1kuWjS4TnCIkCcdZdPfsG.png)
-
-**Original model information:**
-
-# Eris Floramix DPO
-
-This is a mix between Eris Remix DPO and Flora DPO, a finetune of the original Eris Remix on the Synthetic_Soul_1k dataset.
-
-Applied this DPO dataset: https://huggingface.co/datasets/athirdpath/DPO_Pairs-Roleplay-Alpaca-NSFW-v1-SHUFFLED
+**Original model information:**
+
+![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/642265bc01c62c1e4102dc36/FtEEuyGni5M-cxkYYBBHw.jpeg)
+
+The following models were included in the merge:
+* [ResplendentAI/Datura_7B](https://huggingface.co/ResplendentAI/Datura_7B)
+* [Test157t/Eris-Floramix-7b](https://huggingface.co/Test157t/Eris-Floramix-7b)
+
+### Configuration
+
+The following YAML configuration was used to produce this model:
+
+```yaml
+slices:
+  - sources:
+      - model: Test157t/Eris-Floramix-7b
+        layer_range: [0, 32]
+      - model: ResplendentAI/Datura_7B
+        layer_range: [0, 32]
+merge_method: slerp
+base_model: Test157t/Eris-Floramix-7b
+parameters:
+  t:
+    - filter: self_attn
+      value: [0, 0.5, 0.3, 0.7, 1]
+    - filter: mlp
+      value: [1, 0.5, 0.7, 0.3, 0]
+    - value: 0.5
+dtype: bfloat16
+```
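The `quantization_options` list above is plain data; each entry names a llama.cpp quantization level. A minimal sketch of how such a list might drive llama.cpp's `quantize` tool with an imatrix file (the binary path, model filenames, and `imatrix.dat` name here are assumptions, not part of this repository):

```python
# Hypothetical paths; adjust to your local llama.cpp build and model files.
QUANTIZE_BIN = "./llama.cpp/quantize"
BASE_GGUF = "model-f16.gguf"
IMATRIX_FILE = "imatrix.dat"

quantization_options = [
    "Q4_K_M", "IQ4_XS", "Q5_K_M", "Q6_K",
    "Q8_0", "IQ3_M", "IQ3_S", "IQ3_XXS",
]

def build_commands(options):
    """Return one quantize command per requested quantization level."""
    commands = []
    for quant in options:
        out_file = f"model-{quant}.gguf"
        commands.append([
            QUANTIZE_BIN,
            "--imatrix", IMATRIX_FILE,  # importance matrix guides low-bit quants
            BASE_GGUF, out_file, quant,
        ])
    return commands

if __name__ == "__main__":
    for cmd in build_commands(quantization_options):
        print(" ".join(cmd))
        # import subprocess; subprocess.run(cmd, check=True)  # run for real
```

Printing the commands first before executing them makes it easy to sanity-check output filenames for a long batch of quants.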
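As a rough illustration of what the `slerp` merge config above describes, not mergekit's actual implementation: the `value` anchor lists (e.g. `[0, 0.5, 0.3, 0.7, 1]`) are interpolated across the 32 layers to give each layer its own `t`, and each pair of weight tensors is then spherically interpolated with that `t` (`t = 0` keeps the base model, `t = 1` takes the other model). The helper names below are hypothetical:

```python
import math

def layer_t(anchors, layer, num_layers):
    """Piecewise-linear interpolation of anchor values across layer depth.

    Assumes anchors are evenly spaced from the first to the last layer,
    as with mergekit-style value lists like [0, 0.5, 0.3, 0.7, 1].
    """
    frac = layer / max(num_layers - 1, 1)   # depth fraction in [0, 1]
    pos = frac * (len(anchors) - 1)         # position in anchor space
    i = min(int(pos), len(anchors) - 2)     # segment index
    local = pos - i                         # offset within the segment
    return anchors[i] * (1 - local) + anchors[i + 1] * local

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two flattened weight vectors."""
    na = math.sqrt(sum(x * x for x in a)) + eps
    nb = math.sqrt(sum(x * x for x in b)) + eps
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b)) / (na * nb)))
    omega = math.acos(dot)                  # angle between the two vectors
    if omega < eps:                         # nearly parallel: plain lerp
        return [(1 - t) * x + t * y for x, y in zip(a, b)]
    so = math.sin(omega)
    wa = math.sin((1 - t) * omega) / so
    wb = math.sin(t * omega) / so
    return [wa * x + wb * y for x, y in zip(a, b)]

# Per the config, self_attn and mlp use mirrored anchor lists, so at any
# depth the two sub-layer types pull toward opposite parent models.
self_attn_anchors = [0, 0.5, 0.3, 0.7, 1]
mlp_anchors = [1, 0.5, 0.7, 0.3, 0]
```

For example, `layer_t(self_attn_anchors, 0, 32)` is `0` (keep `base_model`) while `layer_t(self_attn_anchors, 31, 32)` is `1` (take the other model), with the middle anchors shaping the blend in between.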