# Hausa W2v-BERT 2.0 Models
This model is a fine-tuned version of [facebook/w2v-bert-2.0](https://huggingface.co/facebook/w2v-bert-2.0) on the CLEAR-GLOBAL/HAUSA_250_250H_YOURTTS - NA dataset. Validation loss, word error rate (WER), and character error rate (CER) at each evaluation checkpoint are reported in the training results table below.
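The fine-tuned checkpoint can be used for Hausa speech recognition through the `transformers` CTC interface. Below is a minimal inference sketch assuming a 16 kHz mono recording; the repository id `your-org/w2v-bert-2.0-hausa` and the audio file name are placeholders, not the actual names from this card.

```python
# Minimal inference sketch (placeholder repo id and file name, see note above).
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2BertForCTC

model_id = "your-org/w2v-bert-2.0-hausa"  # placeholder: substitute the real checkpoint id
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2BertForCTC.from_pretrained(model_id)
model.eval()

# w2v-BERT 2.0 expects 16 kHz mono audio
speech, _ = librosa.load("hausa_sample.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```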
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| 0.5843        | 0.7225  | 1000  | 0.4228          | 0.4167 | 0.2123 |
| 0.2847        | 1.4451  | 2000  | 0.3274          | 0.3731 | 0.2006 |
| 0.0642        | 2.1676  | 3000  | 0.2891          | 0.3553 | 0.1963 |
| 0.082         | 2.8902  | 4000  | 0.2788          | 0.3512 | 0.1944 |
| 0.2373        | 3.6127  | 5000  | 0.2654          | 0.3462 | 0.1927 |
| 0.186         | 4.3353  | 6000  | 0.2633          | 0.3464 | 0.1930 |
| 0.0351        | 5.0578  | 7000  | 0.2603          | 0.3416 | 0.1921 |
| 0.0405        | 5.7803  | 8000  | 0.2670          | 0.3467 | 0.1932 |
| 0.1905        | 6.5029  | 9000  | 0.2480          | 0.3350 | 0.1904 |
| 0.1314        | 7.2254  | 10000 | 0.2616          | 0.3349 | 0.1907 |
| 0.1442        | 7.9480  | 11000 | 0.2596          | 0.3376 | 0.1914 |
| 0.0301        | 8.6705  | 12000 | 0.2439          | 0.3336 | 0.1897 |
| 0.1593        | 9.3931  | 13000 | 0.2410          | 0.3342 | 0.1898 |
| 0.1037        | 10.1156 | 14000 | 0.2461          | 0.3331 | 0.1905 |
| 0.0634        | 10.8382 | 15000 | 0.2598          | 0.3303 | 0.1896 |
| 0.0555        | 11.5607 | 16000 | 0.2400          | 0.3328 | 0.1899 |
| 0.1657        | 12.2832 | 17000 | 0.2437          | 0.3318 | 0.1896 |
| 0.0588        | 13.0058 | 18000 | 0.2386          | 0.3334 | 0.1904 |
| 0.058         | 13.7283 | 19000 | 0.2453          | 0.3294 | 0.1888 |
| 0.1711        | 14.4509 | 20000 | 0.2383          | 0.3317 | 0.1898 |
| 0.1441        | 15.1734 | 21000 | 0.2432          | 0.3324 | 0.1900 |
| 0.1502        | 15.8960 | 22000 | 0.2403          | 0.3287 | 0.1891 |
| 0.0333        | 16.6185 | 23000 | 0.2399          | 0.3313 | 0.1895 |
| 0.1604        | 17.3410 | 24000 | 0.2430          | 0.3273 | 0.1887 |
| 0.1162        | 18.0636 | 25000 | 0.2410          | 0.3277 | 0.1881 |
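The Wer and Cer columns are word error rate and character error rate on the validation set, evaluated every 1000 optimization steps. Below is a sketch of how these metrics are conventionally computed with the Hugging Face `evaluate` library; the prediction and reference strings are illustrative only, and this is not necessarily the exact evaluation code used for this model.

```python
# Hedged sketch: computing WER and CER as in the table's metric columns.
# The example strings are illustrative, not taken from the evaluation data.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["ina son ruwa", "yaya kake"]       # hypothetical model outputs
references = ["ina son ruwan", "yaya kake yau"]   # hypothetical ground-truth transcripts

wer = wer_metric.compute(predictions=predictions, references=references)
cer = cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```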