paulilioaica committed
Commit 3eb8d18 · 1 Parent(s): 2e80735
Update README.md

README.md CHANGED
@@ -32,6 +32,18 @@ PhiMiX-2x2B is a Mixture of Experts (MoE) made with the following models using me
* [mergekit](https://github.com/cg123/mergekit) code which I tweaked (you can find the PhiConfig [here](https://github.com/cg123/mergekit/blob/508348ae34be17ea0a95d0a288a6e34491a2558a/mergekit/architecture.py#L289))
by mainly adding the config in the `moe_mixtral.py` script from the `mixtral` branch (a sketch of this change follows the diff).

+## ⏱️ Benchmarks
+
+| Model |AGIEval|GPT4All|TruthfulQA|Bigbench|Average|
+|----------------------------------------------------------------|------:|------:|---------:|-------:|------:|
+|[**PhiMiX-2x2B**](https://huggingface.co/paulilioaica/PhiMiX-2x2B)| 33.34 | **71.75** | - | - | - |
+|[phixtral-4x2_8](https://huggingface.co/mlabonne/phixtral-4x2_8)| 33.91| 70.44| 48.78| 37.68| 47.7|
+|[phixtral-2x2_8](https://huggingface.co/mlabonne/phixtral-2x2_8)| 34.1| 70.44| 48.78| 37.82| 47.78|
+|[_phi-2-orange_](https://huggingface.co/rhysjones/phi-2-orange)| 33.37| 71.33| 49.87| 37.3| 47.97|
+|[_dolphin-2_6-phi-2_](https://huggingface.co/cognitivecomputations/dolphin-2_6-phi-2)| _33.12_| _69.85_| _47.39_| _37.2_| _46.89_|
+
+I have used __bold__ to highlight this merge in the list, and _italics_ to highlight its base models used in the merge,
+and __bold__ in the cells where it exceeds the performance of either base model.

## 🧩 Configuration
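For context on the tweak mentioned in the diff above: in mergekit of that era, supporting a new architecture meant registering its tensor layout in `architecture.py` so the merge scripts know which weights to route. The sketch below is a rough reconstruction under that assumption; `StaticTensorNames` and its fields follow mergekit's `architecture.py` from around that commit, the weight names follow the original microsoft/phi-2 checkpoint layout, and the linked PhiConfig line in the repo is authoritative, not this code.

```python
# Rough sketch (not the exact linked commit): registering Phi-2's tensor
# layout with mergekit's architecture registry. StaticTensorNames and its
# fields follow mergekit's architecture.py from that era; the weight names
# below match the original microsoft/phi-2 checkpoint format.
from mergekit.architecture import StaticTensorNames

PHI2_INFO = StaticTensorNames(
    name="PhiForCausalLM",
    # weights that sit before/after the decoder stack
    pre_weight_names=["transformer.embd.wte.weight"],
    post_weight_names=[
        "lm_head.linear.bias",
        "lm_head.linear.weight",
        "lm_head.ln.bias",
        "lm_head.ln.weight",
    ],
    embed_weight_names=["transformer.embd.wte.weight", "lm_head.linear.weight"],
    # each decoder layer lives under transformer.h.<idx>
    layer_prefix_format="transformer.h.{idx}",
    layer_weight_suffixes=[
        "ln.bias",
        "ln.weight",
        "mixer.Wqkv.bias",
        "mixer.Wqkv.weight",
        "mixer.out_proj.bias",
        "mixer.out_proj.weight",
        "mlp.fc1.bias",
        "mlp.fc1.weight",
        "mlp.fc2.bias",
        "mlp.fc2.weight",
    ],
    # phi-2's config stores depth as n_layer rather than num_hidden_layers
    num_layers_config_key="n_layer",
)
```

With an entry like this in the registry, and the lookup in `moe_mixtral.py` pointed at it, the Mixtral-style MoE assembly can treat Phi-2 checkpoints as experts; the change is essentially this name mapping, not new merge logic.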