eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0-hero_Matter-0.2-7B-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/0-hero/Matter-0.2-7B-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">0-hero/Matter-0.2-7B-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/0-hero__Matter-0.2-7B-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 0-hero/Matter-0.2-7B-DPO | 26a66f0d862e2024ce4ad0a09c37052ac36e8af6 | 8.805656 | apache-2.0 | 3 | 7.242 | true | false | false | true | 0.609587 | 0.330279 | 33.027921 | 0.359625 | 10.055525 | 0.008308 | 0.830816 | 0.259228 | 1.230425 | 0.381375 | 5.871875 | 0.116356 | 1.817376 | false | false | 2024-04-13 | 2024-08-05 | 0 | 0-hero/Matter-0.2-7B-DPO |
01-ai_Yi-1.5-34B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-1.5-34B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-1.5-34B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-34B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-1.5-34B | 4b486f81c935a2dadde84c6baa1e1370d40a098f | 25.621318 | apache-2.0 | 47 | 34.389 | true | false | false | false | 11.351699 | 0.284117 | 28.411725 | 0.597639 | 42.749363 | 0.151813 | 15.181269 | 0.365772 | 15.436242 | 0.423604 | 11.217188 | 0.466589 | 40.732122 | false | true | 2024-05-11 | 2024-06-12 | 0 | 01-ai/Yi-1.5-34B |
01-ai_Yi-1.5-34B-32K_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-1.5-34B-32K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-1.5-34B-32K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-34B-32K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-1.5-34B-32K | 2c03a29761e4174f20347a60fbe229be4383d48b | 26.67756 | apache-2.0 | 36 | 34.389 | true | false | false | false | 11.577314 | 0.311869 | 31.186917 | 0.601569 | 43.381847 | 0.151057 | 15.10574 | 0.363255 | 15.100671 | 0.439823 | 14.077865 | 0.470911 | 41.212323 | false | true | 2024-05-15 | 2024-06-12 | 0 | 01-ai/Yi-1.5-34B-32K |
01-ai_Yi-1.5-34B-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-1.5-34B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-1.5-34B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-34B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-1.5-34B-Chat | f3128b2d02d82989daae566c0a7eadc621ca3254 | 32.892233 | apache-2.0 | 258 | 34.389 | true | false | false | true | 11.211922 | 0.606676 | 60.667584 | 0.608375 | 44.262826 | 0.249245 | 24.924471 | 0.364933 | 15.324385 | 0.428198 | 13.058073 | 0.452045 | 39.116061 | false | true | 2024-05-10 | 2024-06-12 | 0 | 01-ai/Yi-1.5-34B-Chat |
01-ai_Yi-1.5-34B-Chat-16K_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-1.5-34B-Chat-16K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-1.5-34B-Chat-16K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-34B-Chat-16K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-1.5-34B-Chat-16K | ff74452e11f0f749ab872dc19b1dd3813c25c4d8 | 29.239909 | apache-2.0 | 26 | 34.389 | true | false | false | true | 3.387011 | 0.45645 | 45.645 | 0.610022 | 44.536157 | 0.203927 | 20.392749 | 0.338087 | 11.744966 | 0.43976 | 13.736719 | 0.454455 | 39.383865 | false | true | 2024-05-15 | 2024-07-15 | 0 | 01-ai/Yi-1.5-34B-Chat-16K |
01-ai_Yi-1.5-6B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-1.5-6B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-1.5-6B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-6B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-1.5-6B | cab51fce425b4c1fb19fccfdd96bd5d0908c1657 | 16.695346 | apache-2.0 | 30 | 6.061 | true | false | false | false | 1.184757 | 0.26166 | 26.166017 | 0.449258 | 22.027905 | 0.063444 | 6.344411 | 0.313758 | 8.501119 | 0.437406 | 13.309115 | 0.314412 | 23.823508 | false | true | 2024-05-11 | 2024-08-10 | 0 | 01-ai/Yi-1.5-6B |
01-ai_Yi-1.5-6B-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-1.5-6B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-1.5-6B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-6B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-1.5-6B-Chat | 3f64d3f159c6ad8494227bb77e2a7baef8cd808b | 20.983906 | apache-2.0 | 41 | 6.061 | true | false | false | true | 0.96884 | 0.514527 | 51.452701 | 0.457131 | 23.678723 | 0.054381 | 5.438066 | 0.302013 | 6.935123 | 0.439177 | 14.030469 | 0.319315 | 24.368351 | false | true | 2024-05-11 | 2024-10-22 | 0 | 01-ai/Yi-1.5-6B-Chat |
01-ai_Yi-1.5-9B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-1.5-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-1.5-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-1.5-9B | 8cfde9604384c50137bee480b8cef8a08e5ae81d | 22.141313 | apache-2.0 | 47 | 8.829 | true | false | false | false | 0.734446 | 0.293584 | 29.358436 | 0.514294 | 30.500717 | 0.113293 | 11.329305 | 0.379195 | 17.225951 | 0.432781 | 12.03099 | 0.391622 | 32.402482 | false | true | 2024-05-11 | 2024-06-12 | 0 | 01-ai/Yi-1.5-9B |
01-ai_Yi-1.5-9B-32K_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-1.5-9B-32K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-1.5-9B-32K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-9B-32K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-1.5-9B-32K | 116561dfae63af90f9d163b43077629e0e916bb1 | 19.78461 | apache-2.0 | 18 | 8.829 | true | false | false | false | 0.784037 | 0.230311 | 23.031113 | 0.496332 | 28.937012 | 0.106495 | 10.649547 | 0.35906 | 14.541387 | 0.418615 | 10.826823 | 0.376496 | 30.721779 | false | true | 2024-05-15 | 2024-06-12 | 0 | 01-ai/Yi-1.5-9B-32K |
01-ai_Yi-1.5-9B-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-1.5-9B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-1.5-9B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-9B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-1.5-9B-Chat | bc87d8557c98dc1e5fdef6ec23ed31088c4d3f35 | 27.894417 | apache-2.0 | 136 | 8.829 | true | false | false | true | 0.726772 | 0.604553 | 60.455259 | 0.555906 | 36.952931 | 0.127644 | 12.76435 | 0.334732 | 11.297539 | 0.425906 | 12.838281 | 0.397523 | 33.058141 | false | true | 2024-05-10 | 2024-06-12 | 0 | 01-ai/Yi-1.5-9B-Chat |
01-ai_Yi-1.5-9B-Chat-16K_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-1.5-9B-Chat-16K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-1.5-9B-Chat-16K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-9B-Chat-16K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-1.5-9B-Chat-16K | 2b397e5f0fab87984efa66856c5c4ed4bbe68b50 | 23.035282 | apache-2.0 | 34 | 8.829 | true | false | false | true | 0.792373 | 0.421404 | 42.14041 | 0.515338 | 31.497609 | 0.134441 | 13.444109 | 0.308725 | 7.829978 | 0.409906 | 10.038281 | 0.399352 | 33.261303 | false | true | 2024-05-15 | 2024-06-12 | 0 | 01-ai/Yi-1.5-9B-Chat-16K |
01-ai_Yi-34B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-34B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-34B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-34B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-34B | e1e7da8c75cfd5c44522228599fd4d2990cedd1c | 22.385715 | apache-2.0 | 1,287 | 34.389 | true | false | false | false | 12.828742 | 0.304575 | 30.457519 | 0.54571 | 35.542431 | 0.052115 | 5.21148 | 0.366611 | 15.548098 | 0.411854 | 9.648438 | 0.441157 | 37.906324 | false | true | 2023-11-01 | 2024-06-12 | 0 | 01-ai/Yi-34B |
01-ai_Yi-34B-200K_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-34B-200K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-34B-200K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-34B-200K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-34B-200K | 8ac1a1ebe011df28b78ccd08012aeb2222443c77 | 19.887594 | apache-2.0 | 317 | 34.389 | true | false | false | false | 12.751928 | 0.154249 | 15.424851 | 0.544182 | 36.02211 | 0.049849 | 4.984894 | 0.356544 | 14.205817 | 0.381719 | 9.414844 | 0.453457 | 39.27305 | false | true | 2023-11-06 | 2024-06-12 | 0 | 01-ai/Yi-34B-200K |
01-ai_Yi-34B-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-34B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-34B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-34B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-34B-Chat | 2e528b6a80fb064a0a746c5ca43114b135e30464 | 23.962312 | apache-2.0 | 345 | 34.389 | true | false | false | true | 12.562848 | 0.469889 | 46.988878 | 0.556087 | 37.623988 | 0.046828 | 4.682779 | 0.338087 | 11.744966 | 0.397844 | 8.363802 | 0.409325 | 34.369459 | false | true | 2023-11-22 | 2024-06-12 | 0 | 01-ai/Yi-34B-Chat |
01-ai_Yi-6B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-6B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-6B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-6B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-6B | 7f7fb7662fd8ec09029364f408053c954986c8e5 | 13.611617 | apache-2.0 | 372 | 6.061 | true | false | false | false | 0.549275 | 0.289338 | 28.933785 | 0.430923 | 19.408505 | 0.015861 | 1.586103 | 0.269295 | 2.572707 | 0.393687 | 7.044271 | 0.299119 | 22.124335 | false | true | 2023-11-01 | 2024-06-12 | 0 | 01-ai/Yi-6B |
01-ai_Yi-6B-200K_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-6B-200K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-6B-200K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-6B-200K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-6B-200K | 4a74338e778a599f313e9fa8f5bc08c717604420 | 11.933158 | apache-2.0 | 173 | 6.061 | true | false | false | false | 0.563212 | 0.084331 | 8.433069 | 0.428929 | 20.14802 | 0.01435 | 1.435045 | 0.281879 | 4.250559 | 0.45874 | 16.842448 | 0.284408 | 20.489805 | false | true | 2023-11-06 | 2024-06-12 | 0 | 01-ai/Yi-6B-200K |
01-ai_Yi-6B-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-6B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-6B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-6B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-6B-Chat | 01f7fabb6cfb26efeb764da4a0a19cad2c754232 | 14.004357 | apache-2.0 | 64 | 6.061 | true | false | false | true | 0.555333 | 0.339521 | 33.952136 | 0.41326 | 17.000167 | 0.006798 | 0.679758 | 0.294463 | 5.928412 | 0.368792 | 3.565625 | 0.3061 | 22.900044 | false | true | 2023-11-22 | 2024-06-12 | 0 | 01-ai/Yi-6B-Chat |
01-ai_Yi-9B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-9B | b4a466d95091696285409f1dcca3028543cb39da | 17.774103 | apache-2.0 | 186 | 8.829 | true | false | false | false | 0.765332 | 0.270878 | 27.087794 | 0.493961 | 27.626956 | 0.053625 | 5.362538 | 0.317953 | 9.060403 | 0.405406 | 8.909115 | 0.35738 | 28.597813 | false | true | 2024-03-01 | 2024-06-12 | 0 | 01-ai/Yi-9B |
01-ai_Yi-9B-200K_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-9B-200K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-9B-200K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-9B-200K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-9B-200K | 8c93accd5589dbb74ee938e103613508c4a9b88d | 17.74214 | apache-2.0 | 75 | 8.829 | true | false | false | false | 0.774491 | 0.232709 | 23.270921 | 0.47933 | 26.492495 | 0.067221 | 6.722054 | 0.315436 | 8.724832 | 0.429406 | 12.109115 | 0.362201 | 29.133422 | false | true | 2024-03-15 | 2024-06-12 | 0 | 01-ai/Yi-9B-200K |
01-ai_Yi-Coder-9B-Chat_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/01-ai/Yi-Coder-9B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">01-ai/Yi-Coder-9B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-Coder-9B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 01-ai/Yi-Coder-9B-Chat | 356a1f8d4e4a606d0b879e54191ca809918576b8 | 16.872696 | apache-2.0 | 193 | 8.829 | true | false | false | true | 0.909766 | 0.481704 | 48.17041 | 0.48142 | 25.943153 | 0.033233 | 3.323263 | 0.247483 | 0 | 0.399177 | 7.963802 | 0.24252 | 15.83555 | false | true | 2024-08-21 | 2024-09-12 | 1 | 01-ai/Yi-Coder-9B |
152334H_miqu-1-70b-sf_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/152334H/miqu-1-70b-sf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">152334H/miqu-1-70b-sf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/152334H__miqu-1-70b-sf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 152334H/miqu-1-70b-sf | 1dca4cce36f01f2104ee2e6b97bac6ff7bb300c1 | 29.059643 | | 219 | 68.977 | false | false | false | false | 6.098986 | 0.518174 | 51.8174 | 0.610236 | 43.807147 | 0.122356 | 12.23565 | 0.350671 | 13.422819 | 0.458208 | 17.209375 | 0.422789 | 35.86547 | false | false | 2024-01-30 | 2024-06-26 | 0 | 152334H/miqu-1-70b-sf |
1TuanPham_T-VisStar-7B-v0.1_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/1TuanPham/T-VisStar-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">1TuanPham/T-VisStar-7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/1TuanPham__T-VisStar-7B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 1TuanPham/T-VisStar-7B-v0.1 | b111b59971c14b46c888b96723ff7f3c7b6fd92f | 19.044104 | apache-2.0 | 2 | 7.294 | true | false | false | true | 1.269513 | 0.360704 | 36.070404 | 0.50522 | 30.243834 | 0.05136 | 5.135952 | 0.285235 | 4.697987 | 0.4375 | 13.554167 | 0.321061 | 24.562278 | true | false | 2024-09-19 | 2024-09-22 | 0 | 1TuanPham/T-VisStar-7B-v0.1 |
1TuanPham_T-VisStar-v0.1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/1TuanPham/T-VisStar-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">1TuanPham/T-VisStar-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/1TuanPham__T-VisStar-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 1TuanPham/T-VisStar-v0.1 | c9779bd9630a533f7e42fd8effcca69623d48c9c | 19.044104 | apache-2.0 | 2 | 7.294 | true | false | false | true | 0.624384 | 0.360704 | 36.070404 | 0.50522 | 30.243834 | 0.05136 | 5.135952 | 0.285235 | 4.697987 | 0.4375 | 13.554167 | 0.321061 | 24.562278 | true | false | 2024-09-19 | 2024-09-20 | 0 | 1TuanPham/T-VisStar-v0.1 |
3rd-Degree-Burn_L-3.1-Science-Writer-8B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/3rd-Degree-Burn/L-3.1-Science-Writer-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">3rd-Degree-Burn/L-3.1-Science-Writer-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/3rd-Degree-Burn__L-3.1-Science-Writer-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 3rd-Degree-Burn/L-3.1-Science-Writer-8B | d9bb11fb02f8eca3aec408912278e513377115da | 21.07862 | | 0 | 8.03 | false | false | false | false | 0.709678 | 0.42625 | 42.625013 | 0.504131 | 29.199301 | 0.102719 | 10.271903 | 0.274329 | 3.243848 | 0.395948 | 11.69349 | 0.364943 | 29.438165 | false | false | 2024-11-19 | | 0 | Removed |
3rd-Degree-Burn_Llama-3.1-8B-Squareroot_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/3rd-Degree-Burn/Llama-3.1-8B-Squareroot" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">3rd-Degree-Burn/Llama-3.1-8B-Squareroot</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/3rd-Degree-Burn__Llama-3.1-8B-Squareroot-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 3rd-Degree-Burn/Llama-3.1-8B-Squareroot | 2bec01c2c5d53276eac2222c80190eb44ab2e6af | 10.581747 | apache-2.0 | 1 | 8.03 | true | false | false | true | 0.98705 | 0.221344 | 22.134381 | 0.346094 | 8.618064 | 0.227341 | 22.734139 | 0.256711 | 0.894855 | 0.308917 | 0.78125 | 0.17495 | 8.327793 | true | false | 2024-10-10 | 2024-10-10 | 1 | 3rd-Degree-Burn/Llama-3.1-8B-Squareroot (Merge) |
3rd-Degree-Burn_Llama-3.1-8B-Squareroot-v1_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/3rd-Degree-Burn/Llama-3.1-8B-Squareroot-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">3rd-Degree-Burn/Llama-3.1-8B-Squareroot-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/3rd-Degree-Burn__Llama-3.1-8B-Squareroot-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 3rd-Degree-Burn/Llama-3.1-8B-Squareroot-v1 | 09339d9c3b118ae3c6e7beab8b84347471990988 | 7.597362 | | 0 | 8.03 | false | false | false | true | 0.772749 | 0.289238 | 28.923811 | 0.334277 | 6.515145 | 0.061934 | 6.193353 | 0.255872 | 0.782998 | 0.334063 | 1.757812 | 0.112699 | 1.411052 | false | false | 2024-11-10 | | 0 | Removed |
3rd-Degree-Burn_Llama-Squared-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/3rd-Degree-Burn/Llama-Squared-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">3rd-Degree-Burn/Llama-Squared-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/3rd-Degree-Burn__Llama-Squared-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 3rd-Degree-Burn/Llama-Squared-8B | f30737e92b3a3fa0ef2a3f3ade487cc94ad34400 | 12.233544 | | 0 | 8.03 | false | false | false | true | 1.011112 | 0.275524 | 27.55245 | 0.443103 | 21.277103 | 0.045317 | 4.531722 | 0.271812 | 2.908277 | 0.308948 | 1.951823 | 0.236619 | 15.179891 | false | false | 2024-10-08 | | 0 | Removed |
4season_final_model_test_v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/4season/final_model_test_v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">4season/final_model_test_v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/4season__final_model_test_v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | 4season/final_model_test_v2 | cf690c35d9cf0b0b6bf034fa16dbf88c56fe861c | 21.91554 | apache-2.0 | 0 | 21.421 | true | false | false | false | 1.081038 | 0.319113 | 31.911329 | 0.634205 | 47.41067 | 0.013595 | 1.359517 | 0.327181 | 10.290828 | 0.431448 | 12.43099 | 0.352809 | 28.089908 | false | false | 2024-05-20 | 2024-06-27 | 0 | 4season/final_model_test_v2 |
AALF_FuseChat-Llama-3.1-8B-Instruct-preview_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/AALF/FuseChat-Llama-3.1-8B-Instruct-preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AALF/FuseChat-Llama-3.1-8B-Instruct-preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AALF__FuseChat-Llama-3.1-8B-Instruct-preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AALF/FuseChat-Llama-3.1-8B-Instruct-preview | f740497979293c90fa1cfaa7c446016e107cc2c1 | 25.610368 | | 7 | 8.03 | true | false | false | true | 0.688619 | 0.718958 | 71.895792 | 0.511989 | 30.848065 | 0.070242 | 7.024169 | 0.305369 | 7.38255 | 0.382 | 6.15 | 0.373255 | 30.361628 | false | false | 2024-11-20 | 2024-11-20 | 0 | AALF/FuseChat-Llama-3.1-8B-Instruct-preview |
AALF_FuseChat-Llama-3.1-8B-SFT-preview_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/AALF/FuseChat-Llama-3.1-8B-SFT-preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AALF/FuseChat-Llama-3.1-8B-SFT-preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AALF__FuseChat-Llama-3.1-8B-SFT-preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AALF/FuseChat-Llama-3.1-8B-SFT-preview | 601f2b8c448acc5686656d3979ed732ce050b827 | 27.374839 | | 0 | 8.03 | true | false | false | true | 0.684308 | 0.72805 | 72.805046 | 0.52403 | 32.536782 | 0.114048 | 11.404834 | 0.30453 | 7.270694 | 0.402 | 9.75 | 0.374335 | 30.481678 | false | false | 2024-11-20 | 2024-11-21 | 0 | AALF/FuseChat-Llama-3.1-8B-SFT-preview |
AALF_gemma-2-27b-it-SimPO-37K_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/AALF/gemma-2-27b-it-SimPO-37K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AALF/gemma-2-27b-it-SimPO-37K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AALF__gemma-2-27b-it-SimPO-37K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AALF/gemma-2-27b-it-SimPO-37K | 27f15219df2000a16955c9403c3f38b5f3413b3d | 9.298079 | gemma | 18 | 27.227 | true | false | false | true | 9.997722 | 0.240653 | 24.065258 | 0.391134 | 15.307881 | 0 | 0 | 0.280201 | 4.026846 | 0.34876 | 1.595052 | 0.197141 | 10.79344 | false | false | 2024-08-13 | 2024-09-05 | 2 | google/gemma-2-27b |
AALF_gemma-2-27b-it-SimPO-37K-100steps_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/AALF/gemma-2-27b-it-SimPO-37K-100steps" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AALF/gemma-2-27b-it-SimPO-37K-100steps</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AALF__gemma-2-27b-it-SimPO-37K-100steps-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AALF/gemma-2-27b-it-SimPO-37K-100steps | d5cbf18b2eb90b77f5ddbb74cfcaeedfa692c90c | 9.894336 | gemma | 11 | 27.227 | true | false | false | true | 9.856735 | 0.256764 | 25.676427 | 0.393082 | 15.261078 | 0 | 0 | 0.288591 | 5.145414 | 0.332917 | 0.78125 | 0.212517 | 12.501847 | false | false | 2024-08-13 | 2024-09-21 | 2 | google/gemma-2-27b |
AELLM_gemma-2-aeria-infinity-9b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/AELLM/gemma-2-aeria-infinity-9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AELLM/gemma-2-aeria-infinity-9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AELLM__gemma-2-aeria-infinity-9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AELLM/gemma-2-aeria-infinity-9b | 24e1de07258925d5ddb52134b66e2eb0d698dc11 | 28.344029 | | 1 | 9.242 | false | false | false | true | 3.003789 | 0.7594 | 75.93995 | 0.598334 | 42.090214 | 0 | 0 | 0.333893 | 11.185682 | 0.401969 | 9.046094 | 0.38622 | 31.802231 | false | false | 2024-10-09 | 2024-10-09 | 1 | AELLM/gemma-2-aeria-infinity-9b (Merge) |
AELLM_gemma-2-lyco-infinity-9b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/AELLM/gemma-2-lyco-infinity-9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AELLM/gemma-2-lyco-infinity-9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AELLM__gemma-2-lyco-infinity-9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AELLM/gemma-2-lyco-infinity-9b | 2941a682fcbcfea3f1485c9e0691cc1d9edc742e | 27.204937 | | 0 | 10.159 | false | false | false | true | 2.97852 | 0.731648 | 73.164758 | 0.583953 | 39.787539 | 0 | 0 | 0.32802 | 10.402685 | 0.400635 | 8.91276 | 0.378657 | 30.961879 | false | false | 2024-10-09 | 2024-10-09 | 1 | AELLM/gemma-2-lyco-infinity-9b (Merge) |
AGI-0_Artificium-llama3.1-8B-001_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/AGI-0/Artificium-llama3.1-8B-001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AGI-0/Artificium-llama3.1-8B-001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AGI-0__Artificium-llama3.1-8B-001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AGI-0/Artificium-llama3.1-8B-001 | 6bf3dcca3b75a06a4e04e5f944e709cccf4673fd | 19.063822 | | 0 | 8.03 | false | false | false | true | 1.860327 | 0.524769 | 52.476872 | 0.425622 | 19.348898 | 0.110272 | 11.02719 | 0.26594 | 2.12528 | 0.379458 | 5.165625 | 0.318152 | 24.239066 | false | false | 2024-09-08 | | 0 | Removed |
AGI-0_smartllama3.1-8B-001_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/AGI-0/smartllama3.1-8B-001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AGI-0/smartllama3.1-8B-001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AGI-0__smartllama3.1-8B-001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AGI-0/smartllama3.1-8B-001 | 974d5ee685f1be003a1d8d08e907fe672d225035 | 20.23573 | | 0 | 8.03 | false | false | false | false | 0.718834 | 0.351787 | 35.178659 | 0.467018 | 24.857737 | 0.11858 | 11.858006 | 0.306208 | 7.494407 | 0.438646 | 14.397396 | 0.348654 | 27.628177 | false | false | 2024-11-25 | | 0 | Removed |
AI-MO_NuminaMath-7B-CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/AI-MO/NuminaMath-7B-CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AI-MO/NuminaMath-7B-CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AI-MO__NuminaMath-7B-CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AI-MO/NuminaMath-7B-CoT | ff7e3044218efe64128bd9c21f9ec66c3de04324 | 13.097309 | apache-2.0 | 18 | 6.91 | true | false | false | true | 0.745989 | 0.268854 | 26.885442 | 0.431419 | 19.152364 | 0.088369 | 8.836858 | 0.26594 | 2.12528 | 0.330344 | 0.826302 | 0.286818 | 20.757609 | false | false | 2024-07-15 | 2024-09-10 | 1 | deepseek-ai/deepseek-math-7b-base |
AI-MO_NuminaMath-7B-TIR_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/AI-MO/NuminaMath-7B-TIR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AI-MO/NuminaMath-7B-TIR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AI-MO__NuminaMath-7B-TIR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AI-MO/NuminaMath-7B-TIR | c6e394cc0579423c9cde6df6cc192c07dae73388 | 11.815723 | apache-2.0 | 321 | 6.91 | true | false | false | false | 1.07411 | 0.275624 | 27.562423 | 0.414369 | 16.873547 | 0.018882 | 1.888218 | 0.258389 | 1.118568 | 0.350927 | 4.199219 | 0.273271 | 19.252364 | false | false | 2024-07-04 | 2024-07-11 | 1 | deepseek-ai/deepseek-math-7b-base |
AI-Sweden-Models_Llama-3-8B-instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/AI-Sweden-Models/Llama-3-8B-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AI-Sweden-Models/Llama-3-8B-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AI-Sweden-Models__Llama-3-8B-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AI-Sweden-Models/Llama-3-8B-instruct | 4e1c955228bdb4d69c1c4560e8d5872312a8f033 | 13.777204 | llama3 | 10 | 8.03 | true | false | false | true | 1.166111 | 0.240128 | 24.012841 | 0.417346 | 18.388096 | 0.004532 | 0.453172 | 0.26594 | 2.12528 | 0.477094 | 19.936719 | 0.259724 | 17.747119 | false | false | 2024-06-01 | 2024-06-27 | 2 | meta-llama/Meta-Llama-3-8B |
AI-Sweden-Models_gpt-sw3-40b_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPT2LMHeadModel | <a target="_blank" href="https://huggingface.co/AI-Sweden-Models/gpt-sw3-40b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AI-Sweden-Models/gpt-sw3-40b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AI-Sweden-Models__gpt-sw3-40b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AI-Sweden-Models/gpt-sw3-40b | 1af27994df1287a7fac1b10d60e40ca43a22a385 | 4.734433 | other | 10 | 39.927 | true | false | false | false | 2.959819 | 0.14703 | 14.702988 | 0.326774 | 6.894934 | 0.009063 | 0.906344 | 0.234899 | 0 | 0.36324 | 2.838281 | 0.127576 | 3.064051 | false | false | 2023-02-22 | 2024-06-26 | 0 | AI-Sweden-Models/gpt-sw3-40b |
Aashraf995_Creative-7B-nerd_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Aashraf995/Creative-7B-nerd" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Aashraf995/Creative-7B-nerd</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Aashraf995__Creative-7B-nerd-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Aashraf995/Creative-7B-nerd | fc24bca48549ef8e39cbee5a438e5a16e25e4afa | 29.46208 | apache-2.0 | 2 | 7 | true | false | false | false | 0.648867 | 0.472187 | 47.218713 | 0.560679 | 37.080154 | 0.285498 | 28.549849 | 0.326342 | 10.178971 | 0.451542 | 14.942708 | 0.449219 | 38.802083 | true | false | 2024-12-13 | 2024-12-13 | 1 | Aashraf995/Creative-7B-nerd (Merge) |
Aashraf995_Gemma-Evo-10B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/Aashraf995/Gemma-Evo-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Aashraf995/Gemma-Evo-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Aashraf995__Gemma-Evo-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Aashraf995/Gemma-Evo-10B | 5ec9c5763ca6662dd897cd292e08014ec10b0d74 | 33.961272 | apache-2.0 | 2 | 10 | true | false | false | false | 2.298016 | 0.733221 | 73.322119 | 0.604435 | 43.424559 | 0.200906 | 20.090634 | 0.354027 | 13.870246 | 0.459479 | 16.668229 | 0.427527 | 36.391844 | true | false | 2024-12-13 | 2024-12-13 | 1 | Aashraf995/Gemma-Evo-10B (Merge) |
Aashraf995_Qwen-Evo-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Aashraf995/Qwen-Evo-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Aashraf995/Qwen-Evo-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Aashraf995__Qwen-Evo-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Aashraf995/Qwen-Evo-7B | 641aac3f105805414efe0a55b18736dce73da0a0 | 30.035884 | apache-2.0 | 1 | 7 | true | false | false | false | 0.634008 | 0.475734 | 47.573438 | 0.570936 | 38.585327 | 0.299849 | 29.984894 | 0.325503 | 10.067114 | 0.454146 | 15.534896 | 0.446227 | 38.469637 | true | false | 2024-12-13 | 2024-12-13 | 1 | Aashraf995/Qwen-Evo-7B (Merge) |
Aashraf995_QwenStock-14B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Aashraf995/QwenStock-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Aashraf995/QwenStock-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Aashraf995__QwenStock-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Aashraf995/QwenStock-14B | b91871dcd31fe2e445c233a449d021b47ebfe1fb | 36.73979 | apache-2.0 | 1 | 14 | true | false | false | false | 1.874946 | 0.500863 | 50.086327 | 0.655013 | 50.433899 | 0.333837 | 33.383686 | 0.389262 | 18.568233 | 0.47926 | 19.274219 | 0.538231 | 48.692376 | true | false | 2024-12-13 | 2024-12-13 | 1 | Aashraf995/QwenStock-14B (Merge) |
AbacusResearch_Jallabi-34B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/AbacusResearch/Jallabi-34B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AbacusResearch/Jallabi-34B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AbacusResearch__Jallabi-34B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AbacusResearch/Jallabi-34B | f65696da4ed82c9a20e94b200d9dccffa07af682 | 25.972084 | apache-2.0 | 2 | 34.389 | true | false | false | false | 3.286492 | 0.35286 | 35.286041 | 0.602338 | 43.615765 | 0.039275 | 3.927492 | 0.338926 | 11.856823 | 0.482177 | 20.238802 | 0.468168 | 40.90758 | false | false | 2024-03-01 | 2024-06-27 | 0 | AbacusResearch/Jallabi-34B |
Alibaba-NLP_gte-Qwen2-7B-instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Alibaba-NLP/gte-Qwen2-7B-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Alibaba-NLP__gte-Qwen2-7B-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Alibaba-NLP/gte-Qwen2-7B-instruct | e26182b2122f4435e8b3ebecbf363990f409b45b | 13.40618 | apache-2.0 | 238 | 7.613 | true | false | false | true | 2.172113 | 0.22554 | 22.554045 | 0.449514 | 21.925482 | 0.03852 | 3.851964 | 0.244966 | 0 | 0.355854 | 6.315104 | 0.332114 | 25.790485 | false | false | 2024-06-15 | 2024-08-05 | 0 | Alibaba-NLP/gte-Qwen2-7B-instruct |
Alsebay_Qwen2.5-7B-test-novelist_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Alsebay/Qwen2.5-7B-test-novelist" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Alsebay/Qwen2.5-7B-test-novelist</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Alsebay__Qwen2.5-7B-test-novelist-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Alsebay/Qwen2.5-7B-test-novelist | 89f34e5e67378dc38ce0da19d347ea26c23fbca5 | 25.939214 | apache-2.0 | 1 | 7 | true | false | false | false | 0.667193 | 0.53516 | 53.516004 | 0.515122 | 30.4175 | 0.160876 | 16.087613 | 0.291107 | 5.480984 | 0.474885 | 18.29401 | 0.386553 | 31.83917 | false | false | 2024-12-12 | 2024-12-12 | 3 | Qwen/Qwen2.5-7B |
ArliAI_ArliAI-RPMax-12B-v1.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/ArliAI/ArliAI-RPMax-12B-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ArliAI/ArliAI-RPMax-12B-v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ArliAI__ArliAI-RPMax-12B-v1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ArliAI/ArliAI-RPMax-12B-v1.1 | 645db1cf8ad952eb57854a133e8e15303b898b04 | 20.812694 | apache-2.0 | 41 | 12.248 | true | false | false | true | 1.833402 | 0.534885 | 53.488522 | 0.475182 | 24.809063 | 0.102719 | 10.271903 | 0.281879 | 4.250559 | 0.361844 | 5.563802 | 0.338431 | 26.492317 | false | false | 2024-08-31 | 2024-09-05 | 0 | ArliAI/ArliAI-RPMax-12B-v1.1 |
ArliAI_Llama-3.1-8B-ArliAI-RPMax-v1.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ArliAI__Llama-3.1-8B-ArliAI-RPMax-v1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.1 | 540bd352e59c63900af91b95a932b33aaee70c76 | 23.916967 | llama3 | 28 | 8.03 | true | false | false | true | 0.892745 | 0.635902 | 63.590163 | 0.501561 | 28.787014 | 0.129909 | 12.990937 | 0.283557 | 4.474273 | 0.357688 | 5.310938 | 0.355136 | 28.348478 | false | false | 2024-08-23 | 2024-09-19 | 0 | ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.1 |
Artples_L-MChat-7b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Artples/L-MChat-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Artples/L-MChat-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Artples__L-MChat-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Artples/L-MChat-7b | e10137f5cbfc1b73068d6473e4a87241cca0b3f4 | 21.225905 | apache-2.0 | 2 | 7.242 | true | false | false | true | 0.592226 | 0.529665 | 52.966462 | 0.460033 | 24.201557 | 0.09139 | 9.138973 | 0.305369 | 7.38255 | 0.402865 | 8.12474 | 0.32987 | 25.54115 | true | false | 2024-04-02 | 2024-07-07 | 1 | Artples/L-MChat-7b (Merge) |
Artples_L-MChat-Small_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | PhiForCausalLM | <a target="_blank" href="https://huggingface.co/Artples/L-MChat-Small" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Artples/L-MChat-Small</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Artples__L-MChat-Small-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Artples/L-MChat-Small | 52484c277f6062c12dc6d6b6397ee0d0c21b0126 | 14.891449 | mit | 1 | 2.78 | true | false | false | true | 0.465511 | 0.328706 | 32.870561 | 0.482256 | 26.856516 | 0.017372 | 1.73716 | 0.267617 | 2.348993 | 0.369594 | 9.265885 | 0.246426 | 16.269577 | true | false | 2024-04-11 | 2024-07-07 | 1 | Artples/L-MChat-Small (Merge) |
Aryanne_SuperHeart_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Aryanne/SuperHeart" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Aryanne/SuperHeart</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Aryanne__SuperHeart-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Aryanne/SuperHeart | 02b5050d7e600ce3db81a19638f6043c895d60cf | 25.267673 | llama3.1 | 1 | 8.03 | true | false | false | false | 0.903959 | 0.519223 | 51.922344 | 0.521538 | 31.893554 | 0.138973 | 13.897281 | 0.301174 | 6.823266 | 0.443573 | 14.713281 | 0.391207 | 32.356309 | true | false | 2024-09-23 | 2024-09-23 | 1 | Aryanne/SuperHeart (Merge) |
AtAndDev_Qwen2.5-1.5B-continuous-learnt_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/AtAndDev/Qwen2.5-1.5B-continuous-learnt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AtAndDev/Qwen2.5-1.5B-continuous-learnt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AtAndDev__Qwen2.5-1.5B-continuous-learnt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AtAndDev/Qwen2.5-1.5B-continuous-learnt | 01c0981db9cf0f146fe050065f17343af75a8aa6 | 16.518524 | | 0 | 1.544 | false | false | false | true | 0.673035 | 0.460521 | 46.052142 | 0.425775 | 19.537666 | 0.074773 | 7.477341 | 0.26594 | 2.12528 | 0.363646 | 3.789063 | 0.281167 | 20.129654 | false | false | 2024-10-13 | | 0 | Removed |
AtAndDev_Qwen2.5-1.5B-continuous-learnt_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/AtAndDev/Qwen2.5-1.5B-continuous-learnt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AtAndDev/Qwen2.5-1.5B-continuous-learnt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AtAndDev__Qwen2.5-1.5B-continuous-learnt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AtAndDev/Qwen2.5-1.5B-continuous-learnt | 01c0981db9cf0f146fe050065f17343af75a8aa6 | 16.45133 | | 0 | 1.544 | false | false | false | true | 0.688585 | 0.451054 | 45.105431 | 0.42747 | 19.766409 | 0.085347 | 8.534743 | 0.270134 | 2.684564 | 0.362281 | 2.551823 | 0.280585 | 20.065012 | false | false | 2024-10-18 | | 0 | Removed |
AuraIndustries_Aura-4B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/AuraIndustries/Aura-4B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AuraIndustries/Aura-4B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AuraIndustries__Aura-4B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AuraIndustries/Aura-4B | 808d578b460382ddc90f8828a4dcd1c58deb7045 | 16.076068 | apache-2.0 | 6 | 4 | true | false | false | true | 0.579992 | 0.381562 | 38.156203 | 0.449041 | 22.640857 | 0.043051 | 4.305136 | 0.287752 | 5.033557 | 0.393844 | 7.363802 | 0.270612 | 18.956856 | false | false | 2024-12-12 | 2024-12-13 | 1 | AuraIndustries/Aura-4B (Merge) |
AuraIndustries_Aura-8B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/AuraIndustries/Aura-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AuraIndustries/Aura-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AuraIndustries__Aura-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AuraIndustries/Aura-8B | d7f840c57c89fd655690a8371ce8f5c82f57ad80 | 27.338121 | apache-2.0 | 7 | 8 | true | false | false | true | 0.639714 | 0.720532 | 72.053152 | 0.513123 | 30.981348 | 0.150302 | 15.030211 | 0.286074 | 4.809843 | 0.400448 | 9.222656 | 0.387384 | 31.931516 | false | false | 2024-12-08 | 2024-12-10 | 1 | AuraIndustries/Aura-8B (Merge) |
AuraIndustries_Aura-MoE-2x4B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/AuraIndustries/Aura-MoE-2x4B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AuraIndustries/Aura-MoE-2x4B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AuraIndustries__Aura-MoE-2x4B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AuraIndustries/Aura-MoE-2x4B | 82e9951d78355fd6b37c2a54778df2948e1b52a9 | 16.772802 | apache-2.0 | 0 | 4 | true | true | false | true | 0.997388 | 0.460097 | 46.009699 | 0.433851 | 20.613848 | 0.029456 | 2.945619 | 0.271812 | 2.908277 | 0.40851 | 9.830469 | 0.26496 | 18.328901 | false | false | 2024-12-14 | 2024-12-14 | 1 | AuraIndustries/Aura-MoE-2x4B (Merge) |
AuraIndustries_Aura-MoE-2x4B-v2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/AuraIndustries/Aura-MoE-2x4B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AuraIndustries/Aura-MoE-2x4B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AuraIndustries__Aura-MoE-2x4B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AuraIndustries/Aura-MoE-2x4B-v2 | cc78898fad6443ccfe79b956bfde17bd101c15a0 | 17.377412 | apache-2.0 | 1 | 4 | true | true | false | true | 0.924513 | 0.477782 | 47.778228 | 0.431524 | 20.801181 | 0.023414 | 2.34139 | 0.287752 | 5.033557 | 0.410063 | 10.424479 | 0.260971 | 17.885638 | false | false | 2024-12-15 | 2024-12-15 | 1 | AuraIndustries/Aura-MoE-2x4B-v2 (Merge) |
Aurel9_testmerge-7b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Aurel9/testmerge-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Aurel9/testmerge-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Aurel9__testmerge-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Aurel9/testmerge-7b | b5f0a72d981b5b2c6bd6294093c6956d88477a3e | 20.994478 | | 0 | 7.242 | false | false | false | false | 0.476464 | 0.397998 | 39.799842 | 0.518959 | 32.792793 | 0.067221 | 6.722054 | 0.300336 | 6.711409 | 0.465865 | 17.133073 | 0.305269 | 22.807698 | false | false | 2024-11-16 | 2024-11-16 | 1 | Aurel9/testmerge-7b (Merge) |
Azure99_blossom-v5-32b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Azure99/blossom-v5-32b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Azure99/blossom-v5-32b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Azure99__blossom-v5-32b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Azure99/blossom-v5-32b | ccd4d86e3de01187043683dea1e28df904f7408e | 26.352555 | apache-2.0 | 4 | 32.512 | true | false | false | true | 5.688 | 0.523544 | 52.35442 | 0.595455 | 42.883056 | 0.10423 | 10.422961 | 0.311242 | 8.165548 | 0.402 | 8.35 | 0.423454 | 35.939347 | false | false | 2024-04-29 | 2024-09-21 | 0 | Azure99/blossom-v5-32b |
Azure99_blossom-v5-llama3-8b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Azure99/blossom-v5-llama3-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Azure99/blossom-v5-llama3-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Azure99__blossom-v5-llama3-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Azure99/blossom-v5-llama3-8b | 91ea35e2e65516988021e4bb3b908e3e497e05c2 | 14.473082 | apache-2.0 | 4 | 8.03 | true | false | false | true | 0.872153 | 0.434293 | 43.429323 | 0.418491 | 18.306535 | 0.043807 | 4.380665 | 0.265101 | 2.013423 | 0.367021 | 5.310938 | 0.220578 | 13.397606 | false | false | 2024-04-20 | 2024-09-21 | 0 | Azure99/blossom-v5-llama3-8b |
Azure99_blossom-v5.1-34b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Azure99/blossom-v5.1-34b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Azure99/blossom-v5.1-34b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Azure99__blossom-v5.1-34b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Azure99/blossom-v5.1-34b | 2c803204f5dbf4ce37e2df98eb0205cdc53de10d | 28.599286 | apache-2.0 | 5 | 34.389 | true | false | false | true | 9.591483 | 0.569656 | 56.965629 | 0.610911 | 44.147705 | 0.1571 | 15.70997 | 0.309564 | 7.941834 | 0.392792 | 7.298958 | 0.455785 | 39.531619 | false | false | 2024-05-19 | 2024-07-27 | 0 | Azure99/blossom-v5.1-34b |
Azure99_blossom-v5.1-9b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Azure99/blossom-v5.1-9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Azure99/blossom-v5.1-9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Azure99__blossom-v5.1-9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Azure99/blossom-v5.1-9b | 6044a3dc1e04529fe883aa513d37f266a320d793 | 24.871504 | apache-2.0 | 2 | 8.829 | true | false | false | true | 2.21572 | 0.508582 | 50.858167 | 0.534329 | 34.201244 | 0.116314 | 11.63142 | 0.33557 | 11.409396 | 0.399396 | 8.024479 | 0.397939 | 33.104314 | false | false | 2024-05-15 | 2024-07-24 | 0 | Azure99/blossom-v5.1-9b |
BAAI_Gemma2-9B-IT-Simpo-Infinity-Preference_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/BAAI/Gemma2-9B-IT-Simpo-Infinity-Preference" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Gemma2-9B-IT-Simpo-Infinity-Preference</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Gemma2-9B-IT-Simpo-Infinity-Preference-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BAAI/Gemma2-9B-IT-Simpo-Infinity-Preference | 028a91b1a4f14d365c6db08093b03348455c7bad | 20.984069 | | 16 | 9.242 | false | false | false | true | 5.86346 | 0.317638 | 31.763831 | 0.597946 | 42.190844 | 0 | 0 | 0.339765 | 11.96868 | 0.396573 | 8.104948 | 0.386885 | 31.876108 | false | false | 2024-08-28 | 2024-09-05 | 2 | google/gemma-2-9b |
BAAI_Infinity-Instruct-3M-0613-Llama3-70B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0613-Llama3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0613-Llama3-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0613-Llama3-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BAAI/Infinity-Instruct-3M-0613-Llama3-70B | 9fc53668064bdda22975ca72c5a287f8241c95b3 | 34.697075 | apache-2.0 | 5 | 70.554 | true | false | false | true | 10.526907 | 0.682113 | 68.211346 | 0.664161 | 51.327161 | 0.162387 | 16.238671 | 0.358221 | 14.42953 | 0.45226 | 16.532552 | 0.472989 | 41.443189 | false | false | 2024-06-27 | 2024-06-28 | 0 | BAAI/Infinity-Instruct-3M-0613-Llama3-70B |
BAAI_Infinity-Instruct-3M-0613-Mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0613-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0613-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0613-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BAAI/Infinity-Instruct-3M-0613-Mistral-7B | c7a742e539ec264b9eaeefe2aed29e92e8a7ebd6 | 22.180237 | apache-2.0 | 11 | 7.242 | true | false | false | true | 0.949375 | 0.531987 | 53.198735 | 0.495823 | 28.992936 | 0.074773 | 7.477341 | 0.296141 | 6.152125 | 0.435083 | 13.252083 | 0.316074 | 24.0082 | false | false | 2024-06-21 | 2024-06-27 | 0 | BAAI/Infinity-Instruct-3M-0613-Mistral-7B |
BAAI_Infinity-Instruct-3M-0625-Llama3-70B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0625-Llama3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0625-Llama3-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0625-Llama3-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BAAI/Infinity-Instruct-3M-0625-Llama3-70B | 6d8ceada57e55cff3503191adc4d6379ff321fe2 | 36.142217 | apache-2.0 | 3 | 70.554 | true | false | false | true | 10.430955 | 0.744212 | 74.421202 | 0.667034 | 52.028162 | 0.179003 | 17.900302 | 0.357383 | 14.317673 | 0.461656 | 18.340365 | 0.45861 | 39.845597 | false | false | 2024-07-09 | 2024-08-30 | 0 | BAAI/Infinity-Instruct-3M-0625-Llama3-70B |
BAAI_Infinity-Instruct-3M-0625-Llama3-8B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0625-Llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0625-Llama3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0625-Llama3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BAAI/Infinity-Instruct-3M-0625-Llama3-8B | 7be7c0ff1e35c3bb781c47222da99a1724f5f1da | 21.60936 | apache-2.0 | 3 | 8.03 | true | false | false | true | 0.858004 | 0.605027 | 60.502688 | 0.495499 | 28.988222 | 0.061178 | 6.117825 | 0.275168 | 3.355705 | 0.371208 | 5.667708 | 0.325216 | 25.02401 | false | false | 2024-07-09 | 2024-07-13 | 0 | BAAI/Infinity-Instruct-3M-0625-Llama3-8B |
BAAI_Infinity-Instruct-3M-0625-Mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0625-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0625-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0625-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BAAI/Infinity-Instruct-3M-0625-Mistral-7B | 302e3ae0bcc50dae3fb69fc1b08b518398e8c407 | 22.843425 | apache-2.0 | 3 | 7.242 | true | false | false | true | 0.785797 | 0.586742 | 58.674207 | 0.493967 | 28.823289 | 0.076284 | 7.628399 | 0.286913 | 4.9217 | 0.42724 | 12.238281 | 0.322972 | 24.774675 | false | false | 2024-07-09 | 2024-08-05 | 0 | BAAI/Infinity-Instruct-3M-0625-Mistral-7B |
BAAI_Infinity-Instruct-3M-0625-Qwen2-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0625-Qwen2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0625-Qwen2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0625-Qwen2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BAAI/Infinity-Instruct-3M-0625-Qwen2-7B | 503c24156d7682458686a7b5324f7f886e63470d | 24.135357 | apache-2.0 | 8 | 7.616 | true | false | false | true | 1.330078 | 0.555393 | 55.539302 | 0.534591 | 34.656829 | 0.068731 | 6.873112 | 0.312919 | 8.389262 | 0.38876 | 6.461719 | 0.396027 | 32.891918 | false | false | 2024-07-09 | 2024-08-05 | 0 | BAAI/Infinity-Instruct-3M-0625-Qwen2-7B |
BAAI_Infinity-Instruct-3M-0625-Yi-1.5-9B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0625-Yi-1.5-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0625-Yi-1.5-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0625-Yi-1.5-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BAAI/Infinity-Instruct-3M-0625-Yi-1.5-9B | a42c86c61b98ca4fdf238d688fe6ea11cf414d29 | 27.943551 | apache-2.0 | 3 | 8.829 | true | false | false | true | 1.116801 | 0.518598 | 51.859843 | 0.550912 | 35.378707 | 0.151813 | 15.181269 | 0.354027 | 13.870246 | 0.457531 | 16.72474 | 0.411818 | 34.646498 | false | false | 2024-07-09 | 2024-08-05 | 0 | BAAI/Infinity-Instruct-3M-0625-Yi-1.5-9B |
BAAI_Infinity-Instruct-7M-0729-Llama3_1-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-0729-Llama3_1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-0729-Llama3_1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-0729-Llama3_1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BAAI/Infinity-Instruct-7M-0729-Llama3_1-8B | 0aca33fd7500a781d041e8bf7e5e3789b03f54f4 | 23.094956 | llama3.1 | 8 | 8.03 | true | false | false | true | 0.866805 | 0.613195 | 61.319521 | 0.507734 | 30.888805 | 0.106495 | 10.649547 | 0.292785 | 5.704698 | 0.357844 | 5.297135 | 0.32239 | 24.710033 | false | false | 2024-08-02 | 2024-08-05 | 0 | BAAI/Infinity-Instruct-7M-0729-Llama3_1-8B |
BAAI_Infinity-Instruct-7M-0729-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-0729-mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-0729-mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-0729-mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BAAI/Infinity-Instruct-7M-0729-mistral-7B | 36651591cb13346ecbde23832013e024029700fa | 22.914335 | apache-2.0 | 4 | 7.242 | true | false | false | true | 0.799261 | 0.616193 | 61.619281 | 0.496381 | 28.697915 | 0.064955 | 6.495468 | 0.290268 | 5.369128 | 0.406188 | 10.040104 | 0.327377 | 25.264111 | false | false | 2024-07-25 | 2024-08-05 | 0 | BAAI/Infinity-Instruct-7M-0729-mistral-7B |
BAAI_Infinity-Instruct-7M-Gen-Llama3_1-70B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-Gen-Llama3_1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B | 1ef63c4993a8c723c9695c827295c17080a64435 | 37.10681 | llama3.1 | 17 | 70.554 | true | false | false | true | 11.069121 | 0.733546 | 73.354588 | 0.66952 | 52.498947 | 0.229607 | 22.960725 | 0.375839 | 16.778523 | 0.453906 | 16.971615 | 0.460688 | 40.076463 | false | false | 2024-07-25 | 2024-09-26 | 0 | BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B |
BAAI_Infinity-Instruct-7M-Gen-Llama3_1-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-Gen-Llama3_1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B | 56f9c2845ae024eb8b1dd9ea0d8891cbaf33c596 | 23.094956 | llama3.1 | 8 | 8.03 | true | false | false | true | 0.91714 | 0.613195 | 61.319521 | 0.507734 | 30.888805 | 0.106495 | 10.649547 | 0.292785 | 5.704698 | 0.357844 | 5.297135 | 0.32239 | 24.710033 | false | false | 2024-08-02 | 2024-08-29 | 0 | BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B |
BAAI_Infinity-Instruct-7M-Gen-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-Gen-mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-Gen-mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-Gen-mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BAAI/Infinity-Instruct-7M-Gen-mistral-7B | 82c83d670a8954f4250547b53a057dea1fbd460d | 22.888939 | apache-2.0 | 4 | 7.242 | true | false | false | true | 0.824635 | 0.614669 | 61.466908 | 0.496381 | 28.697915 | 0.064955 | 6.495468 | 0.290268 | 5.369128 | 0.406188 | 10.040104 | 0.327377 | 25.264111 | false | false | 2024-07-25 | 2024-08-29 | 0 | BAAI/Infinity-Instruct-7M-Gen-mistral-7B |
BAAI_OPI-Llama-3.1-8B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BAAI/OPI-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/OPI-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__OPI-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BAAI/OPI-Llama-3.1-8B-Instruct | 48504799d009b4e1b29e6d2948a7cde68acdc3b0 | 8.305018 | llama3.1 | 1 | 8.03 | true | false | false | true | 0.671657 | 0.207455 | 20.745511 | 0.355122 | 9.768712 | 0 | 0 | 0.274329 | 3.243848 | 0.323302 | 3.579427 | 0.212434 | 12.492612 | false | false | 2024-09-06 | 2024-09-21 | 2 | meta-llama/Meta-Llama-3.1-8B |
BEE-spoke-data_Meta-Llama-3-8Bee_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BEE-spoke-data/Meta-Llama-3-8Bee" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/Meta-Llama-3-8Bee</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__Meta-Llama-3-8Bee-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BEE-spoke-data/Meta-Llama-3-8Bee | 8143e34e77a49a30ec2617c5c9cc22cb3cda2287 | 14.544519 | llama3 | 0 | 8.03 | true | false | false | false | 0.83038 | 0.195066 | 19.506576 | 0.462636 | 24.199033 | 0.041541 | 4.154079 | 0.313758 | 8.501119 | 0.365406 | 6.242448 | 0.321975 | 24.663859 | false | false | 2024-04-28 | 2024-07-04 | 1 | meta-llama/Meta-Llama-3-8B |
BEE-spoke-data_smol_llama-101M-GQA_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-101M-GQA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-101M-GQA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-101M-GQA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BEE-spoke-data/smol_llama-101M-GQA | bb26643db413bada7e0c3c50752bf9da82403dba | 3.918895 | apache-2.0 | 28 | 0.101 | true | false | false | false | 0.119606 | 0.138437 | 13.843712 | 0.301756 | 3.198004 | 0 | 0 | 0.25755 | 1.006711 | 0.371271 | 4.275521 | 0.110705 | 1.189421 | false | false | 2023-10-26 | 2024-07-06 | 0 | BEE-spoke-data/smol_llama-101M-GQA |
BEE-spoke-data_smol_llama-220M-GQA_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-220M-GQA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-220M-GQA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-220M-GQA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BEE-spoke-data/smol_llama-220M-GQA | 8845b1d3c0bc73522ef2700aab467183cbdca9f7 | 6.401567 | apache-2.0 | 12 | 0.218 | true | false | false | false | 0.163613 | 0.238605 | 23.860468 | 0.303167 | 3.037843 | 0 | 0 | 0.255872 | 0.782998 | 0.405875 | 9.067708 | 0.114943 | 1.660387 | false | false | 2023-12-22 | 2024-06-26 | 0 | BEE-spoke-data/smol_llama-220M-GQA |
BEE-spoke-data_smol_llama-220M-GQA-fineweb_edu_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-220M-GQA-fineweb_edu-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu | dec16b41d5e94070dbc1f8449a554373fd4cc1d1 | 6.516558 | apache-2.0 | 1 | 0.218 | true | false | false | false | 0.161876 | 0.198812 | 19.881248 | 0.292905 | 2.314902 | 0 | 0 | 0.259228 | 1.230425 | 0.43676 | 14.261719 | 0.112699 | 1.411052 | false | false | 2024-06-08 | 2024-06-26 | 1 | BEE-spoke-data/smol_llama-220M-GQA |
BEE-spoke-data_smol_llama-220M-openhermes_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-220M-openhermes" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-220M-openhermes</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-220M-openhermes-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BEE-spoke-data/smol_llama-220M-openhermes | fb4bcd4b7eee363baacb4176a26cea2aaeb173f4 | 4.761772 | apache-2.0 | 5 | 0.218 | true | false | false | false | 0.154426 | 0.155523 | 15.55229 | 0.302752 | 3.107692 | 0 | 0 | 0.267617 | 2.348993 | 0.384729 | 6.224479 | 0.112035 | 1.337175 | false | false | 2023-12-30 | 2024-09-21 | 1 | BEE-spoke-data/smol_llama-220M-GQA |
BEE-spoke-data_tFINE-900m-e16-d32-flan_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | T5ForConditionalGeneration | <a target="_blank" href="https://huggingface.co/BEE-spoke-data/tFINE-900m-e16-d32-flan" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/tFINE-900m-e16-d32-flan</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__tFINE-900m-e16-d32-flan-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BEE-spoke-data/tFINE-900m-e16-d32-flan | d9ffec9798402d13d8f2c56ec3de3ad092445297 | 4.433887 | apache-2.0 | 0 | 0.887 | true | false | false | false | 2.456006 | 0.150577 | 15.057714 | 0.302804 | 4.411894 | 0 | 0 | 0.233221 | 0 | 0.372417 | 3.71875 | 0.130735 | 3.414967 | false | false | 2024-09-06 | 2024-09-13 | 1 | pszemraj/tFINE-900m-e16-d32-1024ctx |
BEE-spoke-data_tFINE-900m-e16-d32-flan-infinity-instruct-7m-T2T_en-1024_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | T5ForConditionalGeneration | <a target="_blank" href="https://huggingface.co/BEE-spoke-data/tFINE-900m-e16-d32-flan-infinity-instruct-7m-T2T_en-1024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/tFINE-900m-e16-d32-flan-infinity-instruct-7m-T2T_en-1024</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__tFINE-900m-e16-d32-flan-infinity-instruct-7m-T2T_en-1024-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BEE-spoke-data/tFINE-900m-e16-d32-flan-infinity-instruct-7m-T2T_en-1024 | b1e2f12f5224be9f7da0cb5ff30e1bbb3f10f6ca | 5.823653 | apache-2.0 | 0 | 0.887 | true | false | false | false | 2.600608 | 0.132067 | 13.206736 | 0.313779 | 4.737018 | 0 | 0 | 0.254195 | 0.559284 | 0.439271 | 13.808854 | 0.12367 | 2.630024 | false | false | 2024-09-10 | 2024-09-14 | 2 | pszemraj/tFINE-900m-e16-d32-1024ctx |
BEE-spoke-data_tFINE-900m-e16-d32-instruct_2e_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | T5ForConditionalGeneration | <a target="_blank" href="https://huggingface.co/BEE-spoke-data/tFINE-900m-e16-d32-instruct_2e" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/tFINE-900m-e16-d32-instruct_2e</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__tFINE-900m-e16-d32-instruct_2e-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BEE-spoke-data/tFINE-900m-e16-d32-instruct_2e | 4c626138c9f4e0c3eafe74b2755eb89334c7ca59 | 5.681552 | apache-2.0 | 0 | 0.887 | true | false | false | false | 2.516619 | 0.140286 | 14.028555 | 0.313457 | 5.01307 | 0 | 0 | 0.259228 | 1.230425 | 0.420698 | 11.18724 | 0.12367 | 2.630024 | false | false | 2024-09-17 | 2024-09-22 | 3 | pszemraj/tFINE-900m-e16-d32-1024ctx |
BEE-spoke-data_tFINE-900m-instruct-orpo_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | T5ForConditionalGeneration | <a target="_blank" href="https://huggingface.co/BEE-spoke-data/tFINE-900m-instruct-orpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/tFINE-900m-instruct-orpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__tFINE-900m-instruct-orpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BEE-spoke-data/tFINE-900m-instruct-orpo | e0a21c79bac74442252d36e2c01403afa3f0971b | 3.431957 | apache-2.0 | 0 | 0.887 | true | false | false | true | 2.574962 | 0.132992 | 13.299157 | 0.302209 | 3.267301 | 0 | 0 | 0.259228 | 1.230425 | 0.340854 | 1.106771 | 0.115193 | 1.688091 | false | false | 2024-09-22 | 2024-09-23 | 0 | BEE-spoke-data/tFINE-900m-instruct-orpo |
BSC-LT_salamandra-7b_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BSC-LT/salamandra-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BSC-LT/salamandra-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BSC-LT__salamandra-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BSC-LT/salamandra-7b | bf30739316ceac4b624583a27ec96dfc401179e8 | 5.641971 | apache-2.0 | 14 | 7.768 | true | false | false | false | 0.189289 | 0.136738 | 13.67383 | 0.351661 | 10.157422 | 0 | 0 | 0.270134 | 2.684564 | 0.350094 | 1.861719 | 0.149269 | 5.474291 | false | false | 2024-09-30 | 2024-11-22 | 0 | BSC-LT/salamandra-7b |
BSC-LT_salamandra-7b-instruct_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BSC-LT/salamandra-7b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BSC-LT/salamandra-7b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BSC-LT__salamandra-7b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BSC-LT/salamandra-7b-instruct | 77ddccbc7d9f9ffd55a8535365e8eebc493ccb8e | 10.080539 | apache-2.0 | 35 | 7.768 | true | false | false | true | 1.147504 | 0.245074 | 24.507418 | 0.385132 | 14.688129 | 0.002266 | 0.226586 | 0.264262 | 1.901566 | 0.413437 | 10.213021 | 0.180519 | 8.946513 | false | false | 2024-09-30 | 2024-11-22 | 1 | BSC-LT/salamandra-7b-instruct (Merge) |
Ba2han_Llama-Phi-3_DoRA_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Ba2han/Llama-Phi-3_DoRA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Ba2han/Llama-Phi-3_DoRA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Ba2han__Llama-Phi-3_DoRA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Ba2han/Llama-Phi-3_DoRA | 36f99064a7be8ba475c2ee5c5424e95c263ccb87 | 25.318838 | mit | 6 | 3.821 | true | false | false | true | 0.533136 | 0.513053 | 51.305314 | 0.551456 | 37.249164 | 0.112538 | 11.253776 | 0.326342 | 10.178971 | 0.406927 | 9.532552 | 0.391539 | 32.393248 | false | false | 2024-05-15 | 2024-06-26 | 0 | Ba2han/Llama-Phi-3_DoRA |
BenevolenceMessiah_Qwen2.5-72B-2x-Instruct-TIES-v1.0_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/BenevolenceMessiah/Qwen2.5-72B-2x-Instruct-TIES-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BenevolenceMessiah/Qwen2.5-72B-2x-Instruct-TIES-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BenevolenceMessiah__Qwen2.5-72B-2x-Instruct-TIES-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BenevolenceMessiah/Qwen2.5-72B-2x-Instruct-TIES-v1.0 | 459891ec78c9bbed2836a8bba706e1707db10231 | 34.185749 | 1 | 72.7 | false | false | false | true | 17.350892 | 0.54735 | 54.734992 | 0.727311 | 61.911495 | 0.093656 | 9.365559 | 0.36745 | 15.659955 | 0.420667 | 12.016667 | 0.562832 | 51.425827 | false | false | 2024-11-11 | 2024-11-24 | 1 | BenevolenceMessiah/Qwen2.5-72B-2x-Instruct-TIES-v1.0 (Merge) |
BenevolenceMessiah_Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/BenevolenceMessiah/Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BenevolenceMessiah/Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BenevolenceMessiah__Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BenevolenceMessiah/Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0 | d90f6e36584dc9b367461701e83c833bdeb736f2 | 15.096268 | apache-2.0 | 0 | 28.309 | true | true | false | false | 3.334797 | 0.301153 | 30.115316 | 0.490867 | 26.877991 | 0.043051 | 4.305136 | 0.262584 | 1.677852 | 0.407979 | 8.930729 | 0.268035 | 18.670582 | true | false | 2024-09-21 | 2024-09-22 | 1 | BenevolenceMessiah/Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0 (Merge) |
BlackBeenie_Bloslain-8B-v0.2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BlackBeenie/Bloslain-8B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Bloslain-8B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Bloslain-8B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BlackBeenie/Bloslain-8B-v0.2 | ebcb7f9f30bc172523a827d1ddefeb52b1aba494 | 23.803914 | 1 | 8.03 | false | false | false | false | 0.691763 | 0.502337 | 50.233713 | 0.511088 | 30.662902 | 0.145015 | 14.501511 | 0.306208 | 7.494407 | 0.407573 | 10.446615 | 0.365359 | 29.484338 | false | false | 2024-11-19 | 2024-11-19 | 1 | BlackBeenie/Bloslain-8B-v0.2 (Merge) |
BlackBeenie_Llama-3.1-8B-pythonic-passthrough-merge_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Llama-3.1-8B-pythonic-passthrough-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge | 3ec46616f5b34821b3b928938931295f92e49213 | 7.311462 | 0 | 20.245 | false | false | false | false | 3.58329 | 0.231586 | 23.158553 | 0.345385 | 9.359905 | 0.006042 | 0.60423 | 0.268456 | 2.46085 | 0.377812 | 4.593229 | 0.133228 | 3.692007 | false | false | 2024-11-06 | 2024-11-06 | 1 | BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge (Merge) |
BlackBeenie_Neos-Gemma-2-9b_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/BlackBeenie/Neos-Gemma-2-9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Neos-Gemma-2-9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Gemma-2-9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BlackBeenie/Neos-Gemma-2-9b | 56dbbb4f972be887e5b57311a8a32e148e98d154 | 25.211313 | apache-2.0 | 1 | 9.242 | true | false | false | true | 2.679092 | 0.587567 | 58.756655 | 0.550298 | 35.638851 | 0.082326 | 8.232628 | 0.322987 | 9.731544 | 0.36175 | 5.785417 | 0.398105 | 33.122784 | false | false | 2024-11-11 | 2024-11-11 | 1 | BlackBeenie/Neos-Gemma-2-9b (Merge) |
BlackBeenie_Neos-Llama-3.1-8B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BlackBeenie/Neos-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Neos-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BlackBeenie/Neos-Llama-3.1-8B | 9b48520ec1a777be0f1fd88f95454d85ac568407 | 19.461825 | apache-2.0 | 1 | 8.03 | true | false | false | true | 0.793867 | 0.494394 | 49.439376 | 0.4425 | 21.080123 | 0.129154 | 12.915408 | 0.268456 | 2.46085 | 0.37499 | 5.740365 | 0.326213 | 25.134826 | false | false | 2024-11-12 | 2024-11-12 | 1 | BlackBeenie/Neos-Llama-3.1-8B (Merge) |
BlackBeenie_Neos-Llama-3.1-base_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BlackBeenie/Neos-Llama-3.1-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Neos-Llama-3.1-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Llama-3.1-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BlackBeenie/Neos-Llama-3.1-base | d4af4d73ba5fea0275fd1e3ba5102a79ac8009db | 3.968795 | 0 | 4.65 | false | false | false | true | 1.409285 | 0.175082 | 17.508212 | 0.293034 | 2.221447 | 0 | 0 | 0.237416 | 0 | 0.349906 | 2.838281 | 0.111203 | 1.244829 | false | false | 2024-11-11 | 2024-11-11 | 0 | BlackBeenie/Neos-Llama-3.1-base |
BlackBeenie_Neos-Phi-3-14B-v0.1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/BlackBeenie/Neos-Phi-3-14B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Neos-Phi-3-14B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Phi-3-14B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BlackBeenie/Neos-Phi-3-14B-v0.1 | 0afb7cc74a94f11f2695dc92788cdc6e28325f9c | 26.843485 | apache-2.0 | 0 | 13.96 | true | false | false | true | 0.909626 | 0.402245 | 40.224493 | 0.621193 | 46.631387 | 0.166918 | 16.691843 | 0.305369 | 7.38255 | 0.412542 | 10.534375 | 0.456366 | 39.596262 | false | false | 2024-11-27 | 2024-11-27 | 1 | BlackBeenie/Neos-Phi-3-14B-v0.1 (Merge) |
BlackBeenie_llama-3-luminous-merged_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BlackBeenie/llama-3-luminous-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/llama-3-luminous-merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__llama-3-luminous-merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BlackBeenie/llama-3-luminous-merged | 64288dd8e3305f2dc11d84fe0c653f351b2e8a9d | 21.480108 | 0 | 8.03 | false | false | false | false | 0.763854 | 0.432345 | 43.234507 | 0.515392 | 30.643687 | 0.07855 | 7.854985 | 0.292785 | 5.704698 | 0.414896 | 10.628646 | 0.377327 | 30.814125 | false | false | 2024-09-15 | 2024-10-11 | 1 | BlackBeenie/llama-3-luminous-merged (Merge) |
BlackBeenie_llama-3.1-8B-Galore-openassistant-guanaco_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__llama-3.1-8B-Galore-openassistant-guanaco-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco | 828fa03c10e9085700b7abbe26f95067fab010fd | 18.072101 | 1 | 8.03 | false | false | false | false | 0.85682 | 0.263484 | 26.348422 | 0.521337 | 31.444705 | 0.048338 | 4.833837 | 0.300336 | 6.711409 | 0.440625 | 14.578125 | 0.320645 | 24.516105 | false | false | 2024-10-16 | 2024-10-19 | 0 | BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco |
Bllossom_llama-3.2-Korean-Bllossom-AICA-5B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MllamaForConditionalGeneration | <a target="_blank" href="https://huggingface.co/Bllossom/llama-3.2-Korean-Bllossom-AICA-5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Bllossom/llama-3.2-Korean-Bllossom-AICA-5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Bllossom__llama-3.2-Korean-Bllossom-AICA-5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Bllossom/llama-3.2-Korean-Bllossom-AICA-5B | 4672b7de38c2cc390b146d6b6ce7a6dd295d8a0e | 18.169448 | llama3.2 | 36 | 5 | true | false | false | true | 0.610118 | 0.51725 | 51.724979 | 0.429307 | 18.650223 | 0.073263 | 7.326284 | 0.298658 | 6.487696 | 0.383396 | 5.824479 | 0.271027 | 19.003029 | false | false | 2024-12-12 | 2024-12-16 | 0 | Bllossom/llama-3.2-Korean-Bllossom-AICA-5B |