# Model Card for ethicalabs/Kurtis-E1.1-Qwen3-4B

Kurtis E1.1, fine-tuned with [Flower](https://flower.ai/).
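
For a quick smoke test, here is a minimal usage sketch with Hugging Face `transformers`; the prompt, generation settings, and `device_map` choice are illustrative assumptions, not part of the original card:

```python
# Minimal sketch: load the model and run one chat turn.
# Assumes `transformers` (and `accelerate` for device_map="auto") are installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ethicalabs/Kurtis-E1.1-Qwen3-4B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Build the prompt with the model's own chat template.
messages = [{"role": "user", "content": "How can I manage stress before an exam?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```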

## Eval Results

Evaluation tasks were performed with the [LM Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness) on a Mac Mini M4 Pro.

### mmlu

```bash
lm_eval --model hf --model_args pretrained=ethicalabs/Kurtis-E1.1-Qwen3-4B --tasks mmlu --device mps --batch_size 4
```
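
Note that `--device mps` targets Apple-silicon GPUs; on other machines the same command should work with `--device cuda:0` or `--device cpu`.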

| Tasks |Version|Filter|n-shot|Metric| |Value | |Stderr|
|---------------------------------------|------:|------|-----:|------|---|-----:|---|-----:|
|mmlu | 2|none | |acc |↑ |0.6849|± |0.0037|
| - humanities | 2|none | |acc |↑ |0.5951|± |0.0067|
| - formal_logic | 1|none | 0|acc |↑ |0.5952|± |0.0439|
| - high_school_european_history | 1|none | 0|acc |↑ |0.7879|± |0.0319|
| - high_school_us_history | 1|none | 0|acc |↑ |0.8333|± |0.0262|
| - high_school_world_history | 1|none | 0|acc |↑ |0.8439|± |0.0236|
| - international_law | 1|none | 0|acc |↑ |0.7686|± |0.0385|
| - jurisprudence | 1|none | 0|acc |↑ |0.7685|± |0.0408|
| - logical_fallacies | 1|none | 0|acc |↑ |0.8037|± |0.0312|
| - moral_disputes | 1|none | 0|acc |↑ |0.7081|± |0.0245|
| - moral_scenarios | 1|none | 0|acc |↑ |0.3754|± |0.0162|
| - philosophy | 1|none | 0|acc |↑ |0.7170|± |0.0256|
| - prehistory | 1|none | 0|acc |↑ |0.7346|± |0.0246|
| - professional_law | 1|none | 0|acc |↑ |0.4844|± |0.0128|
| - world_religions | 1|none | 0|acc |↑ |0.7778|± |0.0319|
| - other | 2|none | |acc |↑ |0.7161|± |0.0078|
| - business_ethics | 1|none | 0|acc |↑ |0.7300|± |0.0446|
| - clinical_knowledge | 1|none | 0|acc |↑ |0.7396|± |0.0270|
| - college_medicine | 1|none | 0|acc |↑ |0.7168|± |0.0344|
| - global_facts | 1|none | 0|acc |↑ |0.3300|± |0.0473|
| - human_aging | 1|none | 0|acc |↑ |0.6771|± |0.0314|
| - management | 1|none | 0|acc |↑ |0.8155|± |0.0384|
| - marketing | 1|none | 0|acc |↑ |0.8675|± |0.0222|
| - medical_genetics | 1|none | 0|acc |↑ |0.7600|± |0.0429|
| - miscellaneous | 1|none | 0|acc |↑ |0.8008|± |0.0143|
| - nutrition | 1|none | 0|acc |↑ |0.7255|± |0.0256|
| - professional_accounting | 1|none | 0|acc |↑ |0.5390|± |0.0297|
| - professional_medicine | 1|none | 0|acc |↑ |0.7390|± |0.0267|
| - virology | 1|none | 0|acc |↑ |0.5000|± |0.0389|
| - social sciences | 2|none | |acc |↑ |0.7813|± |0.0074|
| - econometrics | 1|none | 0|acc |↑ |0.6228|± |0.0456|
| - high_school_geography | 1|none | 0|acc |↑ |0.8283|± |0.0269|
| - high_school_government_and_politics | 1|none | 0|acc |↑ |0.8756|± |0.0238|
| - high_school_macroeconomics | 1|none | 0|acc |↑ |0.7590|± |0.0217|
| - high_school_microeconomics | 1|none | 0|acc |↑ |0.8151|± |0.0252|
| - high_school_psychology | 1|none | 0|acc |↑ |0.8679|± |0.0145|
| - human_sexuality | 1|none | 0|acc |↑ |0.7405|± |0.0384|
| - professional_psychology | 1|none | 0|acc |↑ |0.7173|± |0.0182|
| - public_relations | 1|none | 0|acc |↑ |0.6818|± |0.0446|
| - security_studies | 1|none | 0|acc |↑ |0.7265|± |0.0285|
| - sociology | 1|none | 0|acc |↑ |0.8308|± |0.0265|
| - us_foreign_policy | 1|none | 0|acc |↑ |0.8100|± |0.0394|
| - stem | 2|none | |acc |↑ |0.6943|± |0.0079|
| - abstract_algebra | 1|none | 0|acc |↑ |0.5700|± |0.0498|
| - anatomy | 1|none | 0|acc |↑ |0.6370|± |0.0415|
| - astronomy | 1|none | 0|acc |↑ |0.8092|± |0.0320|
| - college_biology | 1|none | 0|acc |↑ |0.8333|± |0.0312|
| - college_chemistry | 1|none | 0|acc |↑ |0.5400|± |0.0501|
| - college_computer_science | 1|none | 0|acc |↑ |0.6600|± |0.0476|
| - college_mathematics | 1|none | 0|acc |↑ |0.5700|± |0.0498|
| - college_physics | 1|none | 0|acc |↑ |0.5784|± |0.0491|
| - computer_security | 1|none | 0|acc |↑ |0.7800|± |0.0416|
| - conceptual_physics | 1|none | 0|acc |↑ |0.7787|± |0.0271|
| - electrical_engineering | 1|none | 0|acc |↑ |0.7586|± |0.0357|
| - elementary_mathematics | 1|none | 0|acc |↑ |0.6878|± |0.0239|
| - high_school_biology | 1|none | 0|acc |↑ |0.8742|± |0.0189|
| - high_school_chemistry | 1|none | 0|acc |↑ |0.7192|± |0.0316|
| - high_school_computer_science | 1|none | 0|acc |↑ |0.8500|± |0.0359|
| - high_school_mathematics | 1|none | 0|acc |↑ |0.4741|± |0.0304|
| - high_school_physics | 1|none | 0|acc |↑ |0.6225|± |0.0396|
| - high_school_statistics | 1|none | 0|acc |↑ |0.7083|± |0.0310|
| - machine_learning | 1|none | 0|acc |↑ |0.5268|± |0.0474|

| Groups |Version|Filter|n-shot|Metric| |Value | |Stderr|
|------------------|------:|------|------|------|---|-----:|---|-----:|
|mmlu | 2|none | |acc |↑ |0.6849|± |0.0037|
| - humanities | 2|none | |acc |↑ |0.5951|± |0.0067|
| - other | 2|none | |acc |↑ |0.7161|± |0.0078|
| - social sciences| 2|none | |acc |↑ |0.7813|± |0.0074|
| - stem | 2|none | |acc |↑ |0.6943|± |0.0079|
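
The same evaluation can also be driven programmatically. Below is a minimal sketch, assuming the `simple_evaluate` entry point described in the lm-evaluation-harness README; treat the argument names as assumptions for your installed version:

```python
# Sketch: run the MMLU evaluation through the harness's Python API.
# Assumes `lm_eval` is installed (pip install lm_eval).
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=ethicalabs/Kurtis-E1.1-Qwen3-4B",
    tasks=["mmlu"],
    device="mps",   # "cuda:0" on NVIDIA hardware
    batch_size=4,
)

# Aggregate metrics are keyed by task/group name under results["results"].
print(results["results"]["mmlu"])
```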