Suparious committed
Commit 2259410
Parent: 84b74dc

Update README.md

Files changed (1):
  1. README.md (+17 -2)
README.md CHANGED
```diff
@@ -6,10 +6,25 @@ tags:
 - text-generation
 - autotrain_compatible
 - endpoints_compatible
+- merge
+- mergekit
+- lazymergekit
+- zhengr/MixTAO-7Bx2-MoE-v8.1
+- jsfs11/MixtureofMerges-MoE-2x7b-v6
+base_model:
+- zhengr/MixTAO-7Bx2-MoE-v8.1
+- jsfs11/MixtureofMerges-MoE-2x7b-v6
 pipeline_tag: text-generation
 inference: false
 quantized_by: Suparious
 ---
-#
+# jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9b AWQ
 
-**UPLOAD IN PROGRESS**
+- Model creator: [jsfs11](https://huggingface.co/jsfs11)
+- Original model: [MixtureofMerges-MoE-2x7b-SLERPv0.9b](https://huggingface.co/jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9b)
+
+## Model Summary
+
+MixtureofMerges-MoE-2x7b-SLERPv0.9b is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
+* [zhengr/MixTAO-7Bx2-MoE-v8.1](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1)
+* [jsfs11/MixtureofMerges-MoE-2x7b-v6](https://huggingface.co/jsfs11/MixtureofMerges-MoE-2x7b-v6)
```
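
The "SLERP" in the new title refers to spherical linear interpolation, the weight-merging method implied by the `mergekit`/`lazymergekit` tags added above. As a rough illustration of the idea only (not mergekit's actual implementation, and the commit does not include the merge config), slerp blends two weight tensors along the arc between them rather than along a straight line:

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    A sketch of the idea behind a SLERP merge; real merge tooling adds
    per-layer t schedules, dtype handling, and more degenerate-case logic.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    # Angle between the two weight vectors, from their normalized dot product.
    cos_omega = torch.clamp(
        (a_flat / (a_flat.norm() + eps)) @ (b_flat / (b_flat.norm() + eps)), -1.0, 1.0
    )
    omega = torch.acos(cos_omega)
    so = torch.sin(omega)
    if so.abs() < eps:  # nearly colinear weights: fall back to plain lerp
        merged = (1.0 - t) * a_flat + t * b_flat
    else:
        merged = (torch.sin((1.0 - t) * omega) / so) * a_flat \
               + (torch.sin(t * omega) / so) * b_flat
    return merged.reshape(a.shape).to(a.dtype)
```

Applied per tensor with t = 0.5 this gives an even blend of the two source models; mergekit-style configs typically vary t by layer and module type.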
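
Given the "AWQ" suffix in the title and `quantized_by: Suparious` in the front matter, the finished card will presumably describe an AWQ quant of the merged model. A minimal usage sketch with the AutoAWQ library follows; the repo id is a placeholder, since the commit does not name the quantized repository:

```python
# Minimal sketch; requires `pip install autoawq` and a CUDA GPU.
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

# Placeholder repo id: the commit does not state where the AWQ weights are published.
quant_path = "<namespace>/MixtureofMerges-MoE-2x7b-SLERPv0.9b-AWQ"

tokenizer = AutoTokenizer.from_pretrained(quant_path)
model = AutoAWQForCausalLM.from_quantized(quant_path, fuse_layers=True)

tokens = tokenizer(
    "What does merging two mixture-of-experts models achieve?",
    return_tensors="pt",
).input_ids.cuda()
output = model.generate(tokens, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```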