PuoBERTa
🤗 https://huggingface.co/dsfsi/PuoBERTa
Give Feedback 📑: DSFSI Resource Feedback Form
A RoBERTa-based language model fine-tuned for news categorisation in Setswana, based on https://huggingface.co/dsfsi/PuoBERTa.
We use the IPTC news codes (https://iptc.org/standards/newscodes/) as the category labels.
Training, development, and validation data: https://huggingface.co/datasets/dsfsi/daily-news-dikgang.
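As a minimal sketch, the dataset can be loaded with the datasets library; the split names used below are an assumption, so check the dataset card for the actual configuration.

```python
from datasets import load_dataset

# Load the Daily News Dikgang dataset from the Hugging Face Hub.
# The split names ("train", etc.) are an assumption; verify them
# against the dataset card before use.
dataset = load_dataset("dsfsi/daily-news-dikgang")

print(dataset)              # show available splits and sizes
print(dataset["train"][0])  # inspect a single labelled example
```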
Performance of models on the Daily News Dikgang dataset

| Model | 5-fold Cross Validation F1 | Test F1 |
|---|---|---|
| Logistic Regression + TFIDF | 60.1 | 56.2 |
| NCHLT TSN RoBERTa | 64.7 | 60.3 |
| PuoBERTa | 63.8 | 62.9 |
| PuoBERTaJW300 | 66.2 | 65.4 |
Use this model for news text classification in Setswana; a usage sketch follows.
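A minimal usage sketch with the transformers pipeline API. The model id "dsfsi/PuoBERTa-News" is an assumption; substitute the Hub id under which this fine-tuned checkpoint is published.

```python
from transformers import pipeline

# The model id below is an assumption; replace it with the Hub id of
# this fine-tuned news-categorisation checkpoint.
classifier = pipeline("text-classification", model="dsfsi/PuoBERTa-News")

# Replace the placeholder with an actual Setswana news article or headline.
result = classifier("Setswana news text goes here.")
print(result)  # e.g. [{'label': '<category>', 'score': 0.95}]
```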
Bibtex Reference
@inproceedings{marivate2023puoberta,
  title = {PuoBERTa: Training and evaluation of a curated language model for Setswana},
  author = {Vukosi Marivate and Moseli Mots'Oehli and Valencia Wagner and Richard Lastrucci and Isheanesu Dzingirai},
  year = {2023},
  booktitle = {Artificial Intelligence Research. SACAIR 2023. Communications in Computer and Information Science},
  url = {https://link.springer.com/chapter/10.1007/978-3-031-49002-6_17},
  keywords = {NLP},
  preprint_url = {https://arxiv.org/abs/2310.09141},
  dataset_url = {https://github.com/dsfsi/PuoBERTa},
  software_url = {https://huggingface.co/dsfsi/PuoBERTa}
}
Your contributions are welcome! Feel free to improve the model.
Vukosi Marivate
For more details, reach out or check our website.
Email: [email protected]
Enjoy exploring Setswana through AI!