tdot604 committed · Commit b707e20 · verified · Parent: 9149c56

Update README.md

---
base_model:
- TheDrummer/Moist-Miqu-70B-v1
- Sao10K/L3-70B-Euryale-v2.1
library_name: transformers
tags:
- mergekit
- merge
---
# Result

<img src="https://huggingface.co/tdot604/Electric-Sheep-120b/resolve/main/2024-05-16_20-49-20_1415.png" alt="Electric-Sheep-120b" width="1080" height="770">

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the passthrough merge method, which stacks the selected layer slices from each source model in sequence rather than averaging their weights.

### Models Merged

The following models were included in the merge:
* [TheDrummer/Moist-Miqu-70B-v1](https://huggingface.co/TheDrummer/Moist-Miqu-70B-v1)
* [Sao10K/L3-70B-Euryale-v2.1](https://huggingface.co/Sao10K/L3-70B-Euryale-v2.1)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: Sao10K/L3-70B-Euryale-v2.1
        layer_range: [0, 16]
  - sources:
      - model: TheDrummer/Moist-Miqu-70B-v1
        layer_range: [8, 24]
  - sources:
      - model: Sao10K/L3-70B-Euryale-v2.1
        layer_range: [17, 32]
  - sources:
      - model: TheDrummer/Moist-Miqu-70B-v1
        layer_range: [25, 40]
  - sources:
      - model: Sao10K/L3-70B-Euryale-v2.1
        layer_range: [33, 48]
  - sources:
      - model: TheDrummer/Moist-Miqu-70B-v1
        layer_range: [41, 56]
  - sources:
      - model: Sao10K/L3-70B-Euryale-v2.1
        layer_range: [49, 64]
  - sources:
      - model: TheDrummer/Moist-Miqu-70B-v1
        layer_range: [57, 72]
  - sources:
      - model: Sao10K/L3-70B-Euryale-v2.1
        layer_range: [65, 80]
merge_method: passthrough
dtype: float16
```
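A quick way to sanity-check a passthrough stack like this is to total the layers the slices contribute. A minimal sketch, assuming mergekit's end-exclusive `layer_range` convention; the ranges are copied from the config above:

```python
# Layer ranges from the config above, in order (end-exclusive).
slices = [
    (0, 16), (8, 24), (17, 32), (25, 40), (33, 48),
    (41, 56), (49, 64), (57, 72), (65, 80),
]

# Each slice contributes (end - start) transformer layers to the stacked model.
total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 137
```

At 137 layers versus the 80 layers of each 70B parent, the stacked model lands in the roughly 120B-parameter range, consistent with the Electric-Sheep-120b name. A config like this is typically run with mergekit's `mergekit-yaml` command-line tool.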