Update README.md
README.md CHANGED
@@ -32,6 +32,11 @@ library_name: transformers
 
 Coder MOE with 3 top coder models in a Mixture of Experts config, using the full power of each model to code.
 
+Included:
+- Qwen/Qwen2.5-Coder-7B-Instruct (500+ likes)
+- open-r1/OlympicCoder-7B (179+ likes)
+- microsoft/NextCoder-7B (in Float32; recently released; includes new coding methods)
+
 Three models all working together to code.
 
 Default config is 2 experts (of 3 activated).
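
For reference, a minimal usage sketch of the merged MoE with transformers. This assumes a Mixtral-style MoE config (where the number of active experts is exposed as `num_experts_per_tok`) and uses a hypothetical repo id placeholder; substitute the actual Hub id of this model.

```python
# Minimal sketch, assuming a Mixtral-style MoE config from the merge.
# "your-username/Coder-MOE-3x7B" is a hypothetical placeholder repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/Coder-MOE-3x7B"  # hypothetical; replace with the real id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype="auto")

# Per the card, the default config activates 2 of the 3 experts per token;
# in Mixtral-style configs this is stored as `num_experts_per_tok`.
print(model.config.num_experts_per_tok)  # expected: 2

prompt = "Write a Python function that reverses a linked list."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```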