vllm (pretrained=/root/autodl-tmp/Austral-24B-Winton,add_bos_token=true,max_model_len=3096,dtype=bfloat16,trust_remote_code=true), gen_kwargs: (None), limit: 250.0, num_fewshot: 5, batch_size: auto
| Tasks | Version | Filter           | n-shot | Metric      |   | Value |   | Stderr |
|-------|--------:|------------------|-------:|-------------|---|------:|---|-------:|
| gsm8k |       3 | flexible-extract |      5 | exact_match | ↑ | 0.912 | ± | 0.0180 |
|       |         | strict-match     |      5 | exact_match | ↑ | 0.908 | ± | 0.0183 |
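For reference, a run like the one above can be reproduced through the lm-evaluation-harness Python API. The sketch below mirrors the logged settings; it assumes lm-evaluation-harness with vLLM support is installed, and the `pretrained` path is local to the original eval machine.

```python
import lm_eval

# Minimal sketch of reproducing the gsm8k run logged above.
# Settings copied from the logged config line; adjust the local path.
results = lm_eval.simple_evaluate(
    model="vllm",
    model_args=(
        "pretrained=/root/autodl-tmp/Austral-24B-Winton,"
        "add_bos_token=true,max_model_len=3096,"
        "dtype=bfloat16,trust_remote_code=true"
    ),
    tasks=["gsm8k"],
    num_fewshot=5,
    limit=250,
    batch_size="auto",
)
print(results["results"]["gsm8k"])
```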
vllm (pretrained=/root/autodl-tmp/Austral-24B-Winton,add_bos_token=true,max_model_len=3096,dtype=bfloat16,trust_remote_code=true), gen_kwargs: (None), limit: 500.0, num_fewshot: 5, batch_size: auto
| Tasks | Version | Filter           | n-shot | Metric      |   | Value |   | Stderr |
|-------|--------:|------------------|-------:|-------------|---|------:|---|-------:|
| gsm8k |       3 | flexible-extract |      5 | exact_match | ↑ | 0.898 | ± | 0.0135 |
|       |         | strict-match     |      5 | exact_match | ↑ | 0.886 | ± | 0.0142 |
| Groups            | Version | Filter | n-shot | Metric |   | Value  |   | Stderr |
|-------------------|--------:|--------|-------:|--------|---|-------:|---|-------:|
| mmlu              |       2 | none   |        | acc    | ↑ | 0.7977 | ± | 0.0130 |
| - humanities      |       2 | none   |        | acc    | ↑ | 0.8462 | ± | 0.0249 |
| - other           |       2 | none   |        | acc    | ↑ | 0.8103 | ± | 0.0270 |
| - social sciences |       2 | none   |        | acc    | ↑ | 0.8611 | ± | 0.0254 |
| - stem            |       2 | none   |        | acc    | ↑ | 0.7158 | ± | 0.0253 |
vllm (pretrained=/root/autodl-tmp/root90-256-4096-9.9999,add_bos_token=true,max_model_len=3096,dtype=bfloat16,trust_remote_code=true), gen_kwargs: (None), limit: 250.0, num_fewshot: 5, batch_size: auto
| Tasks | Version | Filter           | n-shot | Metric      |   | Value |   | Stderr |
|-------|--------:|------------------|-------:|-------------|---|------:|---|-------:|
| gsm8k |       3 | flexible-extract |      5 | exact_match | ↑ | 0.916 | ± | 0.0176 |
|       |         | strict-match     |      5 | exact_match | ↑ | 0.904 | ± | 0.0187 |
vllm (pretrained=/root/autodl-tmp/root90-256-4096-9.9999,add_bos_token=true,max_model_len=3096,dtype=bfloat16,trust_remote_code=true), gen_kwargs: (None), limit: 500.0, num_fewshot: 5, batch_size: auto
| Tasks | Version | Filter           | n-shot | Metric      |   | Value |   | Stderr |
|-------|--------:|------------------|-------:|-------------|---|------:|---|-------:|
| gsm8k |       3 | flexible-extract |      5 | exact_match | ↑ | 0.904 | ± | 0.0132 |
|       |         | strict-match     |      5 | exact_match | ↑ | 0.882 | ± | 0.0144 |
vllm (pretrained=/root/autodl-tmp/root90-256-4096-9.9999,add_bos_token=true,max_model_len=3048,dtype=bfloat16,trust_remote_code=true), gen_kwargs: (None), limit: 15.0, num_fewshot: None, batch_size: auto
| Groups            | Version | Filter | n-shot | Metric |   | Value  |   | Stderr |
|-------------------|--------:|--------|-------:|--------|---|-------:|---|-------:|
| mmlu              |       2 | none   |        | acc    | ↑ | 0.7977 | ± | 0.0132 |
| - humanities      |       2 | none   |        | acc    | ↑ | 0.8359 | ± | 0.0257 |
| - other           |       2 | none   |        | acc    | ↑ | 0.8308 | ± | 0.0260 |
| - social sciences |       2 | none   |        | acc    | ↑ | 0.8444 | ± | 0.0266 |
| - stem            |       2 | none   |        | acc    | ↑ | 0.7193 | ± | 0.0257 |
Model tree for noneUsername/Austral-24B-Winton-W8A8
Base model: mistralai/Mistral-Small-3.1-24B-Base-2503
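A minimal inference sketch with vLLM follows, assuming the quantized checkpoint loads with the same `dtype` and `max_model_len` used in the evals above; the prompt and sampling parameters are illustrative, not from the original runs.

```python
from vllm import LLM, SamplingParams

# Minimal inference sketch; dtype and max_model_len mirror the eval
# configs above, while the prompt and sampling values are illustrative.
llm = LLM(
    model="noneUsername/Austral-24B-Winton-W8A8",
    dtype="bfloat16",
    max_model_len=3096,
    trust_remote_code=True,
)
params = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(["Question: What is 17 * 24? Answer:"], params)
print(outputs[0].outputs[0].text)
```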