# Hausa W2v-BERT 2.0 Models
This model is a fine-tuned version of facebook/w2v-bert-2.0 on the CLEAR-GLOBAL/HAUSA_579_450H - NA dataset. Evaluation results (validation loss, WER, and CER) over the course of training are reported in the training results table below.
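As a minimal usage sketch, the checkpoint can be loaded through the transformers automatic-speech-recognition pipeline. The repository id below is a placeholder, and the example assumes the checkpoint ships with its processor and a CTC head, as is typical for w2v-bert-2.0 ASR fine-tunes.

```python
# Minimal inference sketch. "CLEAR-GLOBAL/w2v-bert-2.0-hausa" is a placeholder
# repo id -- substitute the actual checkpoint from this collection.
import torch
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="CLEAR-GLOBAL/w2v-bert-2.0-hausa",  # hypothetical model id
    device=0 if torch.cuda.is_available() else -1,
)

# Transcribe a 16 kHz mono recording of Hausa speech.
print(asr("hausa_sample.wav")["text"])
```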
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
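The specific hyperparameter values are not reproduced above. Purely as an illustration of how such a run is configured, the sketch below sets up transformers' TrainingArguments for CTC fine-tuning; every value is an assumption rather than the configuration actually used, apart from the 1000-step evaluation interval visible in the table below.

```python
# Illustrative configuration only -- these values are assumptions, not the
# hyperparameters actually used for this model. Requires a recent transformers
# release (eval_strategy replaced evaluation_strategy in v4.41).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-hausa",    # hypothetical output path
    per_device_train_batch_size=16,     # assumed
    gradient_accumulation_steps=2,      # assumed
    learning_rate=5e-5,                 # assumed
    warmup_steps=500,                   # assumed
    num_train_epochs=8,                 # the table below stops around epoch 7.3
    eval_strategy="steps",
    eval_steps=1000,                    # matches the eval interval in the table
    save_steps=1000,
    logging_steps=100,
    fp16=True,                          # assumed
)
```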
### Training results

Training Loss | Epoch | Step | Validation Loss | WER | CER |
---|---|---|---|---|---|
0.6282 | 0.3057 | 1000 | 0.4477 | 0.4212 | 0.2132 |
0.5207 | 0.6114 | 2000 | 0.3286 | 0.3757 | 0.2016 |
0.3766 | 0.9172 | 3000 | 0.2949 | 0.3559 | 0.1961 |
0.0778 | 1.2229 | 4000 | 0.2759 | 0.3559 | 0.1956 |
0.0831 | 1.5286 | 5000 | 0.2665 | 0.3554 | 0.1942 |
0.0805 | 1.8343 | 6000 | 0.2674 | 0.3433 | 0.1920 |
0.0662 | 2.1400 | 7000 | 0.2634 | 0.3459 | 0.1936 |
0.0828 | 2.4457 | 8000 | 0.2634 | 0.3473 | 0.1949 |
0.0627 | 2.7515 | 9000 | 0.2548 | 0.3413 | 0.1920 |
0.0701 | 3.0572 | 10000 | 0.2533 | 0.3391 | 0.1914 |
0.0698 | 3.3629 | 11000 | 0.2427 | 0.3347 | 0.1903 |
0.0859 | 3.6686 | 12000 | 0.2374 | 0.3326 | 0.1892 |
0.0925 | 3.9743 | 13000 | 0.2341 | 0.3320 | 0.1891 |
0.1522 | 4.2800 | 14000 | 0.2414 | 0.3309 | 0.1895 |
0.1586 | 4.5858 | 15000 | 0.2405 | 0.3335 | 0.1894 |
0.1545 | 4.8915 | 16000 | 0.2311 | 0.3307 | 0.1892 |
0.1721 | 5.1972 | 17000 | 0.2306 | 0.3304 | 0.1893 |
0.1974 | 5.5029 | 18000 | 0.2396 | 0.3332 | 0.1899 |
0.1789 | 5.8086 | 19000 | 0.2285 | 0.3242 | 0.1878 |
0.2498 | 6.1143 | 20000 | 0.2290 | 0.3291 | 0.1887 |
0.1528 | 6.4201 | 21000 | 0.2342 | 0.3316 | 0.1887 |
0.159 | 6.7258 | 22000 | 0.2310 | 0.3243 | 0.1873 |
0.0633 | 7.0315 | 23000 | 0.2312 | 0.3255 | 0.1881 |
0.0843 | 7.3372 | 24000 | 0.2300 | 0.3287 | 0.1886 |
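The WER and CER columns are word and character error rates on the validation set. As a sketch of how these metrics are typically computed (the exact evaluation script for this model is not shown in the card), the example below uses the evaluate library's wer and cer metrics, which require the jiwer package.

```python
# Sketch of computing WER/CER as reported above; illustrative only.
import evaluate

wer_metric = evaluate.load("wer")  # word error rate (requires jiwer)
cer_metric = evaluate.load("cer")  # character error rate (requires jiwer)

# Toy Hausa reference/prediction pair, for demonstration only.
references = ["ina son karanta littafi"]
predictions = ["ina so karanta littafi"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```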
Base model: facebook/w2v-bert-2.0