ROUBERTa cased
This is a RoBERTa-base language model trained from scratch exclusively on Uruguayan press text [1].
Cite this work
@inproceedings{rouberta2024,
  title={A Language Model Trained on Uruguayan Spanish News Text},
  author={Filevich, Juan Pablo and Marco, Gonzalo and Castro, Santiago and Chiruzzo, Luis and Ros{\'a}, Aiala},
  booktitle={Proceedings of the Second International Workshop Towards Digital Language Equality (TDLE): Focusing on Sustainability @ LREC-COLING 2024},
  pages={53--60},
  year={2024}
}