polish-roberta-large-v2

An encoder model based on the RoBERTa architecture, pre-trained on a large corpus of Polish texts. More information can be found in our GitHub repository and in the publication Pre-training Polish Transformer-Based Language Models at Scale.
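
A minimal usage sketch with the Hugging Face transformers library, assuming the model is loaded by its Hub identifier sdadas/polish-roberta-large-v2 and used through the fill-mask pipeline (the mask token is <mask>); the example sentence is illustrative only:

```python
from transformers import pipeline

# Load the model and tokenizer from the Hub for masked-word prediction.
fill_mask = pipeline("fill-mask", model="sdadas/polish-roberta-large-v2")

# Example Polish sentence with one masked token (illustrative input).
for prediction in fill_mask("Warszawa to największe <mask> w Polsce."):
    print(prediction["token_str"], prediction["score"])
```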

Citation

@inproceedings{dadas2020pre,
  title={Pre-training polish transformer-based language models at scale},
  author={Dadas, S{\l}awomir and Pere{\l}kiewicz, Micha{\l} and Po{\'s}wiata, Rafa{\l}},
  booktitle={International Conference on Artificial Intelligence and Soft Computing},
  pages={301--314},
  year={2020},
  organization={Springer}
}