---
language:
- "lzh"
tags:
- "classical chinese"
- "literary chinese"
- "ancient chinese"
- "masked-lm"
license: "apache-2.0"
pipeline_tag: "fill-mask"
mask_token: "[MASK]"
widget:
- text: "孟子[MASK]梁惠王"
---
# modernbert-large-classical-chinese-traditional
## Model Description
This is a ModernBERT model pre-trained on [Kanripo](https://www.kanripo.org) texts. Training took 24 hours and 16 minutes on eight NVIDIA A100-SXM4-40GB GPUs. You can fine-tune `modernbert-large-classical-chinese-traditional` for downstream tasks such as sentence segmentation, POS-tagging, and dependency parsing.
## How to Use
```py
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Load the tokenizer and masked-LM head from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("KoichiYasuoka/modernbert-large-classical-chinese-traditional")
model = AutoModelForMaskedLM.from_pretrained("KoichiYasuoka/modernbert-large-classical-chinese-traditional")
```
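For masked-token prediction, here is a minimal sketch that uses the `pipeline` helper from `transformers` with the widget sentence shown above; it is just one straightforward way to query the model loaded above.

```py
from transformers import pipeline

# Fill-mask inference with the widget example "孟子[MASK]梁惠王";
# the pipeline loads the same tokenizer and model as above.
fill_mask = pipeline(
    "fill-mask",
    model="KoichiYasuoka/modernbert-large-classical-chinese-traditional"
)
for prediction in fill_mask("孟子[MASK]梁惠王"):
    print(prediction["token_str"], prediction["score"])
```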