# oyo-bert-base
OYO-BERT (or Oyo-dialect of Yoruba BERT) was created by pre-training a [BERT model with token dropping](https://aclanthology.org/2022.acl-long.262/) on Yoruba-language texts for about 100K steps.
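Token dropping saves pre-training compute by routing only the "important" tokens through the middle transformer layers, while the rest skip them and are merged back before the final layers. The sketch below illustrates only the token-selection step in plain Python; the importance scores (the paper derives them from the running masked-language-model loss per token) and the `select_kept_tokens` helper are simplified assumptions for illustration, not the TensorFlow Model Garden implementation.

```python
# Illustrative sketch of the token-selection step in token-dropping BERT.
# Assumption: each token already has an importance score; special tokens
# ([CLS], [SEP]) are always kept. Not the Model Garden implementation.

def select_kept_tokens(scores, special_positions, keep_ratio=0.5):
    """Return sorted positions of tokens routed through the middle layers."""
    n_keep = max(1, int(len(scores) * keep_ratio))
    # Special tokens are always kept.
    kept = set(special_positions)
    # Fill the remaining slots with the highest-scoring ordinary tokens.
    ranked = sorted(
        (i for i in range(len(scores)) if i not in kept),
        key=lambda i: scores[i],
        reverse=True,
    )
    for i in ranked:
        if len(kept) >= n_keep:
            break
        kept.add(i)
    return sorted(kept)

# Example: an 8-token sequence where positions 0 and 7 are [CLS]/[SEP].
scores = [0.0, 0.9, 0.1, 0.8, 0.2, 0.7, 0.3, 0.0]
print(select_kept_tokens(scores, special_positions=[0, 7], keep_ratio=0.5))
# → [0, 1, 3, 7]
```

The dropped tokens are not discarded: their hidden states bypass the middle layers unchanged and rejoin the sequence before the last layers, so the model still predicts every masked position.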
It was trained with the BERT-base architecture using the [TensorFlow Model Garden](https://github.com/tensorflow/models/tree/master/official/projects).
### Pre-training corpus
A mix of WURA, Wikipedia, and MT560 Yoruba data.