# Chichewa W2v-BERT 2.0 Models
This model is a fine-tuned version of [facebook/w2v-bert-2.0](https://huggingface.co/facebook/w2v-bert-2.0) on the CLEAR-GLOBAL/CHICHEWA_34_68H - NA dataset. It achieves the evaluation-set results shown in the training results table below.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
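The card does not yet document usage. As a minimal sketch, a fine-tuned W2v-BERT 2.0 checkpoint can typically be loaded for CTC inference with `transformers`; the `model_id` below is a placeholder (the base model's id, since this fine-tuned checkpoint's repo id is not given here), and greedy decoding is assumed:

```python
def transcribe(speech, sampling_rate: int = 16_000,
               model_id: str = "facebook/w2v-bert-2.0") -> str:
    """Greedy CTC transcription of a 16 kHz mono waveform (1-D float array).

    model_id is a placeholder: substitute the id of the fine-tuned
    Chichewa checkpoint, which is not named in this card.
    """
    # Heavy dependencies are imported lazily so the sketch is readable
    # (and importable) without torch/transformers installed.
    import torch
    from transformers import AutoProcessor, Wav2Vec2BertForCTC

    processor = AutoProcessor.from_pretrained(model_id)
    model = Wav2Vec2BertForCTC.from_pretrained(model_id).eval()
    inputs = processor(speech, sampling_rate=sampling_rate, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    pred_ids = torch.argmax(logits, dim=-1)
    # batch_decode collapses repeated CTC tokens and strips blank tokens.
    return processor.batch_decode(pred_ids)[0]
```

Note that the base `facebook/w2v-bert-2.0` checkpoint ships without a CTC vocabulary or head, so this sketch only produces text when pointed at a fine-tuned repository.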
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| 2.5711        | 1.1038  | 1000  | 2.8158          | 0.9906 | 0.8586 |
| 0.1101        | 2.2076  | 2000  | 0.6337          | 0.5731 | 0.1691 |
| 0.0696        | 3.3114  | 3000  | 0.4766          | 0.5119 | 0.1419 |
| 0.0329        | 4.4152  | 4000  | 0.4343          | 0.4797 | 0.1356 |
| 0.0304        | 5.5191  | 5000  | 0.3731          | 0.4476 | 0.1293 |
| 0.0292        | 6.6229  | 6000  | 0.3978          | 0.4217 | 0.1220 |
| 0.0644        | 7.7267  | 7000  | 0.3462          | 0.4160 | 0.1206 |
| 0.2232        | 8.8305  | 8000  | 0.3415          | 0.4014 | 0.1167 |
| 0.0166        | 9.9343  | 9000  | 0.3268          | 0.4064 | 0.1222 |
| 0.0216        | 11.0375 | 10000 | 0.3579          | 0.4225 | 0.1224 |
| 0.0104        | 12.1414 | 11000 | 0.3526          | 0.4144 | 0.1208 |
| 0.0079        | 13.2452 | 12000 | 0.2968          | 0.3613 | 0.1070 |
| 0.0116        | 14.3490 | 13000 | 0.3053          | 0.3904 | 0.1127 |
| 0.0284        | 15.4528 | 14000 | 0.3216          | 0.3644 | 0.1069 |
| 0.0838        | 16.5566 | 15000 | 0.2916          | 0.3803 | 0.1074 |
| 0.007         | 17.6604 | 16000 | 0.3104          | 0.3732 | 0.1064 |
| 0.0039        | 18.7642 | 17000 | 0.3043          | 0.3715 | 0.1079 |
| 0.0084        | 19.8680 | 18000 | 0.3325          | 0.3903 | 0.1114 |
| 0.0147        | 20.9718 | 19000 | 0.3620          | 0.3906 | 0.1095 |
| 0.0083        | 22.0751 | 20000 | 0.3416          | 0.4119 | 0.1123 |
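The Wer and Cer columns are word and character error rates: the Levenshtein edit distance between reference and hypothesis, normalized by reference length, computed over words and over characters respectively. A minimal pure-Python sketch of the computation (training pipelines typically use a library such as `jiwer` or `evaluate` instead; the helper names here are illustrative):

```python
def edit_distance(ref, hyp):
    # Classic one-row dynamic-programming Levenshtein distance
    # over arbitrary token sequences (words or characters).
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            prev, d[j] = d[j], min(d[j] + 1,        # deletion
                                   d[j - 1] + 1,    # insertion
                                   prev + (r != h)) # substitution / match
    return d[-1]

def wer(ref: str, hyp: str) -> float:
    # Word error rate: word-level edit distance / number of reference words.
    r, h = ref.split(), hyp.split()
    return edit_distance(r, h) / len(r)

def cer(ref: str, hyp: str) -> float:
    # Character error rate: char-level edit distance / reference length.
    return edit_distance(list(ref), list(hyp)) / len(ref)

if __name__ == "__main__":
    # One substituted word out of three; one dropped char out of eighteen.
    print(round(wer("moni dziko lapansi", "moni diko lapansi"), 4))  # 0.3333
    print(round(cer("moni dziko lapansi", "moni diko lapansi"), 4))  # 0.0556
```

Because both rates are normalized by the reference length, they can exceed 1.0 for very poor hypotheses, as in the first epoch's Wer of 0.9906 approaching that bound.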