---
base_model:
- lars1234/Mistral-Small-24B-Instruct-2501-writer
---
vllm (pretrained=/root/autodl-tmp/Mistral-Small-24B-Instruct-2501-writer,add_bos_token=true,max_model_len=4096,dtype=bfloat16), gen_kwargs: (None), limit: 250.0, num_fewshot: 5, batch_size: auto

|Tasks|Version| Filter |n-shot| Metric | |Value| |Stderr|
|-----|------:|----------------|-----:|-----------|---|----:|---|-----:|
|gsm8k| 3|flexible-extract| 5|exact_match|↑ |0.924|± |0.0168|
| | |strict-match | 5|exact_match|↑ |0.920|± |0.0172|

vllm (pretrained=/root/autodl-tmp/Mistral-Small-24B-Instruct-2501-writer,add_bos_token=true,max_model_len=4096,dtype=bfloat16), gen_kwargs: (None), limit: 500.0, num_fewshot: 5, batch_size: auto

|Tasks|Version| Filter |n-shot| Metric | |Value| |Stderr|
|-----|------:|----------------|-----:|-----------|---|----:|---|-----:|
|gsm8k| 3|flexible-extract| 5|exact_match|↑ |0.918|± |0.0123|
| | |strict-match | 5|exact_match|↑ |0.908|± |0.0129|

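The GSM8K tables above are lm-evaluation-harness output from runs using the vLLM backend with the arguments shown in each header line. As a minimal sketch (assuming EleutherAI's `lm_eval` CLI is installed and the local checkpoint path above exists; the exact command used for these runs is not recorded in this repo), the limit-250 run corresponds to something like:

```bash
# Hedged reproduction sketch: flags mirror the header line above,
# but the original invocation is an assumption, not a record.
lm_eval \
  --model vllm \
  --model_args pretrained=/root/autodl-tmp/Mistral-Small-24B-Instruct-2501-writer,add_bos_token=true,max_model_len=4096,dtype=bfloat16 \
  --tasks gsm8k \
  --num_fewshot 5 \
  --limit 250 \
  --batch_size auto
```

The limit-500 run presumably differs only in `--limit 500`.
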
vllm (pretrained=/root/autodl-tmp/Mistral-Small-24B-Instruct-2501-writer,add_bos_token=true,max_model_len=4096,dtype=bfloat16,max_num_seqs=3), gen_kwargs: (None), limit: 15.0, num_fewshot: None, batch_size: 1

| Groups |Version|Filter|n-shot|Metric| |Value | |Stderr|
|------------------|------:|------|------|------|---|-----:|---|-----:|
|mmlu | 2|none | |acc |↑ |0.7977|± |0.0131|
| - humanities | 2|none | |acc |↑ |0.8256|± |0.0263|
| - other | 2|none | |acc |↑ |0.8205|± |0.0265|
| - social sciences| 2|none | |acc |↑ |0.8556|± |0.0256|
| - stem | 2|none | |acc |↑ |0.7263|± |0.0249|

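The MMLU run above caps vLLM at three concurrent sequences, uses batch size 1, limits evaluation to 15 examples per subtask, and leaves the few-shot count at the task default. A comparable sketch, under the same assumptions as before:

```bash
# Hedged sketch for the MMLU run above; --limit applies per subtask
# and no --num_fewshot is passed (num_fewshot: None in the header).
lm_eval \
  --model vllm \
  --model_args pretrained=/root/autodl-tmp/Mistral-Small-24B-Instruct-2501-writer,add_bos_token=true,max_model_len=4096,dtype=bfloat16,max_num_seqs=3 \
  --tasks mmlu \
  --limit 15 \
  --batch_size 1
```

The 70-512-df10 and 86-256 runs below use the same settings with a different `pretrained` path.
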
vllm (pretrained=/root/autodl-tmp/70-512-df10,add_bos_token=true,max_model_len=4096,dtype=bfloat16), gen_kwargs: (None), limit: 250.0, num_fewshot: 5, batch_size: auto

|Tasks|Version| Filter |n-shot| Metric | |Value| |Stderr|
|-----|------:|----------------|-----:|-----------|---|----:|---|-----:|
|gsm8k| 3|flexible-extract| 5|exact_match|↑ |0.920|± |0.0172|
| | |strict-match | 5|exact_match|↑ |0.920|± |0.0172|

vllm (pretrained=/root/autodl-tmp/70-512-df10,add_bos_token=true,max_model_len=4096,dtype=bfloat16), gen_kwargs: (None), limit: 500.0, num_fewshot: 5, batch_size: auto

|Tasks|Version| Filter |n-shot| Metric | |Value| |Stderr|
|-----|------:|----------------|-----:|-----------|---|----:|---|-----:|
|gsm8k| 3|flexible-extract| 5|exact_match|↑ |0.910|± |0.0128|
| | |strict-match | 5|exact_match|↑ |0.906|± |0.0131|

vllm (pretrained=/root/autodl-tmp/70-512-df10,add_bos_token=true,max_model_len=4096,dtype=bfloat16,max_num_seqs=3), gen_kwargs: (None), limit: 15.0, num_fewshot: None, batch_size: 1

| Groups |Version|Filter|n-shot|Metric| |Value | |Stderr|
|------------------|------:|------|------|------|---|-----:|---|-----:|
|mmlu | 2|none | |acc |↑ |0.7836|± |0.0133|
| - humanities | 2|none | |acc |↑ |0.8103|± |0.0267|
| - other | 2|none | |acc |↑ |0.7949|± |0.0271|
| - social sciences| 2|none | |acc |↑ |0.8556|± |0.0252|
| - stem | 2|none | |acc |↑ |0.7123|± |0.0257|

vllm (pretrained=/root/autodl-tmp/86-256,add_bos_token=true,max_model_len=4096,dtype=bfloat16), gen_kwargs: (None), limit: 250.0, num_fewshot: 5, batch_size: auto

|Tasks|Version| Filter |n-shot| Metric | |Value| |Stderr|
|-----|------:|----------------|-----:|-----------|---|----:|---|-----:|
|gsm8k| 3|flexible-extract| 5|exact_match|↑ |0.928|± |0.0164|
| | |strict-match | 5|exact_match|↑ |0.924|± |0.0168|

vllm (pretrained=/root/autodl-tmp/86-256,add_bos_token=true,max_model_len=4096,dtype=bfloat16), gen_kwargs: (None), limit: 500.0, num_fewshot: 5, batch_size: auto

|Tasks|Version| Filter |n-shot| Metric | |Value| |Stderr|
|-----|------:|----------------|-----:|-----------|---|----:|---|-----:|
|gsm8k| 3|flexible-extract| 5|exact_match|↑ |0.928|± |0.0116|
| | |strict-match | 5|exact_match|↑ |0.920|± |0.0121|

vllm (pretrained=/root/autodl-tmp/86-256,add_bos_token=true,max_model_len=4096,dtype=bfloat16), gen_kwargs: (None), limit: 15.0, num_fewshot: None, batch_size: 1

| Groups |Version|Filter|n-shot|Metric| |Value | |Stderr|
|------------------|------:|------|------|------|---|-----:|---|-----:|
|mmlu | 2|none | |acc |↑ |0.7942|± |0.0131|
| - humanities | 2|none | |acc |↑ |0.8103|± |0.0270|
| - other | 2|none | |acc |↑ |0.7949|± |0.0280|
| - social sciences| 2|none | |acc |↑ |0.8556|± |0.0252|
| - stem | 2|none | |acc |↑ |0.7439|± |0.0242|