- Developed by: datatab
- License: MIT
## Results
Results obtained through the Serbian LLM Evaluation Benchmark
| MODEL | ARC-E | ARC-C | HellaSwag | PiQA | Winogrande | BoolQ | OpenbookQA | OZ_EVAL | SCORE |
|---|---|---|---|---|---|---|---|---|---|
| YugoGPT-Florida | 0.6918 | 0.5766 | 0.4037 | 0.7374 | 0.5782 | 0.8685 | 0.5918 | 0.7407 | 64.85875 |
| Yugo55A-GPT | 0.5846 | 0.5185 | 0.3686 | 0.7076 | 0.5277 | 0.8584 | 0.5485 | 0.6883 | 60.02750 |
| Yugo60-GPT | 0.4948 | 0.4542 | 0.3342 | 0.6897 | 0.5138 | 0.8212 | 0.5155 | 0.6379 | 55.76625 |
| Yugo45-GPT | 0.4049 | 0.3900 | 0.2812 | 0.6055 | 0.4992 | 0.5793 | 0.4433 | 0.6111 | 47.68125 |
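The SCORE column is consistent with the unweighted mean of the eight per-task accuracies, scaled to a percentage. A minimal sketch that reproduces the YugoGPT-Florida row (the variable names are illustrative, not part of the benchmark harness):

```python
# SCORE appears to be the unweighted mean of the eight task accuracies,
# expressed as a percentage: mean(accuracies) * 100.
yugogpt_florida = [0.6918, 0.5766, 0.4037, 0.7374,
                   0.5782, 0.8685, 0.5918, 0.7407]

def benchmark_score(accuracies):
    """Unweighted mean of per-task accuracies, as a percentage."""
    return sum(accuracies) / len(accuracies) * 100

print(round(benchmark_score(yugogpt_florida), 5))  # 64.85875
```

The same formula reproduces the other rows of the table (e.g. Yugo55A-GPT's accuracies average to 60.0275).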
## Training Stats
## Usage
**Released with permission by datatab** - GGUF quantized by @MarkoRadojcic
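The card does not include usage instructions; below is a minimal sketch of how a GGUF quantization like this one is typically run with `llama-cpp-python`. The Alpaca-style prompt template and the `YugoGPT-Florida.Q4_K_M.gguf` filename are assumptions — check the repository's file list and the base model card for the actual quant names and template.

```python
# Sketch: running a GGUF quant of YugoGPT-Florida with llama-cpp-python.
# The prompt template is an ASSUMED Alpaca-style format; verify it against
# the base model card before relying on it.

def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in an (assumed) Alpaca-style template."""
    return (
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

# Typical invocation (requires `pip install llama-cpp-python` and a local
# GGUF file; the filename below is a placeholder, not a confirmed artifact):
#
#   from llama_cpp import Llama
#   llm = Llama(model_path="YugoGPT-Florida.Q4_K_M.gguf", n_ctx=2048)
#   out = llm(build_prompt("Objasni fotosintezu."), max_tokens=256)
#   print(out["choices"][0]["text"])

if __name__ == "__main__":
    print(build_prompt("Objasni fotosintezu."))
```

GGUF files can also be run directly with the `llama.cpp` CLI or any runtime that supports the format.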
## Contributions Welcome!
Have ideas, bug fixes, or want to add a custom model? We'd love for you to be part of the journey! Contributions help grow and enhance the capabilities of YugoGPT-Florida.
## Citation
If you find this model useful in your research, please cite it as follows:

    @article{YugoGPT-Florida,
      title={YugoGPT-Florida},
      author={datatab},
      year={2024},
      url={https://huggingface.co/datatab/YugoGPT-Florida}
    }

Thanks for using YugoGPT-Florida: where language models meet Serbian precision and creativity! Let's build smarter models together.
