Update README.md
README.md CHANGED
@@ -31,7 +31,7 @@ license: mit
 
 **Highlights**
 - Half the size of SOTA models like QWQ-32b and EXAONE-32b and hence **memory efficient**.
-- It consumes
+- It consumes **40%** fewer tokens compared to QWQ-32b, making it super efficient in production.
 - On par or outperforms on tasks like MBPP, BFCL, Enterprise RAG, MT Bench, MixEval, IFEval and Multi-Challenge, making it great for **Agentic / Enterprise tasks**.
 - Competitive performance on academic benchmarks like AIME-24, AIME-25, AMC-23, MATH-500 and GPQA considering model size.