Update README.md
README.md CHANGED
@@ -6,4 +6,5 @@ tags:
 
 # MixTAO-7Bx2-MoE
 
-MixTAO-7Bx2-MoE is a Mixure of Experts (MoE).
+MixTAO-7Bx2-MoE is a Mixture of Experts (MoE).
+This model is mainly used for experiments with large language model techniques; successive iterations aim to produce a high-quality large language model.