---
base_model: jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9b
inference: false
library_name: transformers
merged_models:
- zhengr/MixTAO-7Bx2-MoE-v8.1
- jsfs11/MixtureofMerges-MoE-2x7b-v6
pipeline_tag: text-generation
quantized_by: Suparious
tags:
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
- merge
- mergekit
- lazymergekit
- zhengr/MixTAO-7Bx2-MoE-v8.1
- jsfs11/MixtureofMerges-MoE-2x7b-v6
---
# jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9b AWQ

- Model creator: [jsfs11](https://huggingface.co/jsfs11)
- Original model: [MixtureofMerges-MoE-2x7b-SLERPv0.9b](https://huggingface.co/jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9b)

## Model Summary

MixtureofMerges-MoE-2x7b-SLERPv0.9b is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [zhengr/MixTAO-7Bx2-MoE-v8.1](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1)
* [jsfs11/MixtureofMerges-MoE-2x7b-v6](https://huggingface.co/jsfs11/MixtureofMerges-MoE-2x7b-v6)
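
## How to use

This repository holds the 4-bit AWQ quantization of the merged model (quantized by Suparious, per the tags above). Below is a minimal inference sketch using `transformers` with the `autoawq` backend installed; the repository id in the snippet is an assumed placeholder, so substitute the actual quantized repo name.

```python
# Minimal inference sketch for the AWQ weights.
# Requires: pip install transformers autoawq
# NOTE: the repo id below is an assumption for illustration; replace it with
# the actual quantized repository.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "solidrust/MixtureofMerges-MoE-2x7b-SLERPv0.9b-AWQ"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # place the 4-bit weights on the available GPU(s)
)

prompt = "Explain what a mixture-of-experts model is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

AWQ keeps the weights in 4-bit while dequantizing on the fly at inference time, so a 2x7B MoE like this one fits comfortably on a single consumer GPU that could not hold the full-precision checkpoint.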