Update README.md
README.md
CHANGED
@@ -2,18 +2,18 @@
 license: cc-by-nc-4.0
 ---
 
-# Mixtral
+# Mixtral MOE 2x7B
 
 
 
-MoE
+MoE of the following models:
 
 
 * [NurtureAI/neural-chat-7b-v3-16k](https://huggingface.co/NurtureAI/neural-chat-7b-v3-16k)
 * [mncai/mistral-7b-dpo-v6](https://huggingface.co/mncai/mistral-7b-dpo-v6)
 
 
-metrics:
+* metrics:
 Average 73.43
 ARC 71.25
 HellaSwag 87.45
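For context, a Mixtral-style 2x7B MoE checkpoint such as the one this README describes can normally be loaded through the standard `transformers` API. The snippet below is only a minimal sketch: the repo id `your-namespace/mixtral-moe-2x7b` is a placeholder assumption, not the actual published model name, and half precision plus `device_map="auto"` (which needs `accelerate` installed) are just one reasonable way to fit a 2x7B model in memory.

```python
# Minimal sketch: loading a Mixtral-style 2x7B MoE merge with transformers.
# The repo id below is a placeholder assumption; replace it with the real model name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-namespace/mixtral-moe-2x7b"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # half precision to keep the 2x7B weights manageable
    device_map="auto",           # requires `accelerate`; spreads layers across GPUs/CPU
)

prompt = "Explain what a mixture-of-experts language model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```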