# bambara-asr
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the SUDOPING01/MALIAN-LANGUAGES-DATASET - BAMBARA dataset. It achieves the following results on the evaluation set:
- Loss: 0.0916
- Wer: 0.0909
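The checkpoint can be used for Bambara speech-to-text with the standard `transformers` ASR pipeline. A minimal usage sketch, assuming the repository ships the matching processor/tokenizer files and that `audio.wav` is a placeholder for a local recording:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub (downloads model and processor).
asr = pipeline("automatic-speech-recognition", model="Panga-Azazia/bambara-asr")

# "audio.wav" is a placeholder path; the pipeline decodes and resamples the file
# to the feature extractor's sampling rate before transcribing.
result = asr("audio.wav")
print(result["text"])
```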
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- total_train_batch_size: 16
- total_eval_batch_size: 16
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 15.0
- mixed_precision_training: Native AMP
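As a rough illustration of how these settings map onto the `transformers` Trainer API, here is a hedged configuration sketch; `output_dir` and the surrounding training script are placeholders, not the authors' actual setup:

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above (placeholder output_dir).
training_args = TrainingArguments(
    output_dir="./bambara-asr",      # placeholder path
    learning_rate=5e-4,
    per_device_train_batch_size=8,   # 2 GPUs -> total train batch size 16
    per_device_eval_batch_size=8,    # 2 GPUs -> total eval batch size 16
    seed=42,
    optim="adamw_torch",             # AdamW; betas=(0.9, 0.999), eps=1e-8 are the defaults
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=15.0,
    fp16=True,                       # native AMP mixed-precision training
)
```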
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer |
---|---|---|---|---|
3.1014 | 0.1036 | 100 | 3.1111 | 1.0 |
2.4582 | 0.2073 | 200 | 2.3973 | 1.0 |
0.748 | 0.3109 | 300 | 0.4278 | 0.4305 |
0.4825 | 0.4145 | 400 | 0.2887 | 0.3229 |
0.403 | 0.5181 | 500 | 0.2650 | 0.3048 |
0.3605 | 0.6218 | 600 | 0.2223 | 0.2620 |
0.4186 | 0.7254 | 700 | 0.2019 | 0.2464 |
0.3236 | 0.8290 | 800 | 0.2005 | 0.2550 |
0.2701 | 0.9326 | 900 | 0.1887 | 0.2265 |
0.1819 | 1.0363 | 1000 | 0.1503 | 0.1940 |
0.2301 | 1.1399 | 1100 | 0.1445 | 0.1915 |
0.2947 | 1.2435 | 1200 | 0.1317 | 0.1778 |
0.1691 | 1.3472 | 1300 | 0.1418 | 0.1898 |
0.1543 | 1.4508 | 1400 | 0.1255 | 0.1751 |
0.1268 | 1.5544 | 1500 | 0.1263 | 0.1661 |
0.1564 | 1.6580 | 1600 | 0.1234 | 0.1703 |
0.2848 | 1.7617 | 1700 | 0.1187 | 0.1639 |
0.1379 | 1.8653 | 1800 | 0.1145 | 0.1547 |
0.1086 | 1.9689 | 1900 | 0.1118 | 0.1507 |
0.1233 | 2.0725 | 2000 | 0.1165 | 0.1541 |
0.1097 | 2.1762 | 2100 | 0.1066 | 0.1502 |
0.1336 | 2.2798 | 2200 | 0.1076 | 0.1496 |
0.1029 | 2.3834 | 2300 | 0.1012 | 0.1398 |
0.1358 | 2.4870 | 2400 | 0.1056 | 0.1445 |
0.1498 | 2.5907 | 2500 | 0.1046 | 0.1482 |
0.175 | 2.6943 | 2600 | 0.1048 | 0.1410 |
0.1134 | 2.7979 | 2700 | 0.1077 | 0.1438 |
0.1702 | 2.9016 | 2800 | 0.0949 | 0.1359 |
0.1405 | 3.0052 | 2900 | 0.1061 | 0.1482 |
0.1361 | 3.1088 | 3000 | 0.0961 | 0.1331 |
0.1352 | 3.2124 | 3100 | 0.0948 | 0.1337 |
0.1007 | 3.3161 | 3200 | 0.1020 | 0.1395 |
0.0682 | 3.4197 | 3300 | 0.1021 | 0.1449 |
0.0588 | 3.5233 | 3400 | 0.0980 | 0.1362 |
0.1053 | 3.6269 | 3500 | 0.1011 | 0.1413 |
0.1143 | 3.7306 | 3600 | 0.0984 | 0.1335 |
0.1011 | 3.8342 | 3700 | 0.0986 | 0.1340 |
0.0832 | 3.9378 | 3800 | 0.0927 | 0.1325 |
0.0984 | 4.0415 | 3900 | 0.0955 | 0.1296 |
0.0827 | 4.1451 | 4000 | 0.0888 | 0.1236 |
0.1832 | 4.2487 | 4100 | 0.0954 | 0.1313 |
0.0858 | 4.3523 | 4200 | 0.0923 | 0.1281 |
0.089 | 4.4560 | 4300 | 0.0891 | 0.1228 |
0.0997 | 4.5596 | 4400 | 0.0841 | 0.1242 |
0.0709 | 4.6632 | 4500 | 0.0885 | 0.1248 |
0.1477 | 4.7668 | 4600 | 0.0871 | 0.1230 |
0.0576 | 4.8705 | 4700 | 0.0855 | 0.1229 |
0.102 | 4.9741 | 4800 | 0.0872 | 0.1216 |
0.1983 | 5.0777 | 4900 | 0.0878 | 0.1193 |
0.1069 | 5.1813 | 5000 | 0.0910 | 0.1186 |
0.0641 | 5.2850 | 5100 | 0.0894 | 0.1196 |
0.1391 | 5.3886 | 5200 | 0.0879 | 0.1192 |
0.1436 | 5.4922 | 5300 | 0.0820 | 0.1164 |
0.1161 | 5.5959 | 5400 | 0.0933 | 0.1224 |
0.0874 | 5.6995 | 5500 | 0.0845 | 0.1188 |
0.1789 | 5.8031 | 5600 | 0.0823 | 0.1172 |
0.169 | 5.9067 | 5700 | 0.0806 | 0.1131 |
0.0585 | 6.0104 | 5800 | 0.0831 | 0.1146 |
0.0838 | 6.1140 | 5900 | 0.0871 | 0.1171 |
0.0645 | 6.2176 | 6000 | 0.0845 | 0.1179 |
0.0569 | 6.3212 | 6100 | 0.0843 | 0.1185 |
0.1063 | 6.4249 | 6200 | 0.0860 | 0.1196 |
0.0975 | 6.5285 | 6300 | 0.0813 | 0.1158 |
0.0629 | 6.6321 | 6400 | 0.0823 | 0.1166 |
0.0948 | 6.7358 | 6500 | 0.0793 | 0.1122 |
0.0469 | 6.8394 | 6600 | 0.0798 | 0.1146 |
0.0618 | 6.9430 | 6700 | 0.0811 | 0.1108 |
0.053 | 7.0466 | 6800 | 0.0808 | 0.1087 |
0.0775 | 7.1503 | 6900 | 0.0807 | 0.1104 |
0.0626 | 7.2539 | 7000 | 0.0879 | 0.1119 |
0.0852 | 7.3575 | 7100 | 0.0885 | 0.1149 |
0.0741 | 7.4611 | 7200 | 0.0832 | 0.1156 |
0.0673 | 7.5648 | 7300 | 0.0842 | 0.1085 |
0.0793 | 7.6684 | 7400 | 0.0842 | 0.1084 |
0.0547 | 7.7720 | 7500 | 0.0842 | 0.1083 |
0.1072 | 7.8756 | 7600 | 0.0781 | 0.1058 |
0.0551 | 7.9793 | 7700 | 0.0803 | 0.1060 |
0.0623 | 8.0829 | 7800 | 0.0840 | 0.1069 |
0.0508 | 8.1865 | 7900 | 0.0848 | 0.1069 |
0.0705 | 8.2902 | 8000 | 0.0789 | 0.1090 |
0.0425 | 8.3938 | 8100 | 0.0797 | 0.1069 |
0.0772 | 8.4974 | 8200 | 0.0824 | 0.1084 |
0.0494 | 8.6010 | 8300 | 0.0810 | 0.1048 |
0.0511 | 8.7047 | 8400 | 0.0810 | 0.1056 |
0.049 | 8.8083 | 8500 | 0.0782 | 0.1035 |
0.0462 | 8.9119 | 8600 | 0.0787 | 0.1057 |
0.0529 | 9.0155 | 8700 | 0.0831 | 0.1067 |
0.0211 | 9.1192 | 8800 | 0.0784 | 0.1034 |
0.0357 | 9.2228 | 8900 | 0.0826 | 0.1037 |
0.0336 | 9.3264 | 9000 | 0.0778 | 0.1016 |
0.0314 | 9.4301 | 9100 | 0.0844 | 0.1021 |
0.0558 | 9.5337 | 9200 | 0.0868 | 0.1023 |
0.0379 | 9.6373 | 9300 | 0.0786 | 0.1018 |
0.0302 | 9.7409 | 9400 | 0.0800 | 0.1028 |
0.0508 | 9.8446 | 9500 | 0.0795 | 0.1010 |
0.0407 | 9.9482 | 9600 | 0.0826 | 0.1054 |
0.0462 | 10.0518 | 9700 | 0.0856 | 0.1042 |
0.026 | 10.1554 | 9800 | 0.0839 | 0.1017 |
0.0615 | 10.2591 | 9900 | 0.0857 | 0.1014 |
0.119 | 10.3627 | 10000 | 0.0827 | 0.1018 |
0.0761 | 10.4663 | 10100 | 0.0844 | 0.1020 |
0.0682 | 10.5699 | 10200 | 0.0859 | 0.1006 |
0.1278 | 10.6736 | 10300 | 0.0837 | 0.1029 |
0.1056 | 10.7772 | 10400 | 0.0829 | 0.1011 |
0.0722 | 10.8808 | 10500 | 0.0844 | 0.0991 |
0.0777 | 10.9845 | 10600 | 0.0824 | 0.0985 |
0.024 | 11.0881 | 10700 | 0.0884 | 0.1001 |
0.0264 | 11.1917 | 10800 | 0.0859 | 0.0998 |
0.0315 | 11.2953 | 10900 | 0.0850 | 0.1007 |
0.0321 | 11.3990 | 11000 | 0.0850 | 0.0998 |
0.0237 | 11.5026 | 11100 | 0.0846 | 0.0994 |
0.0288 | 11.6062 | 11200 | 0.0854 | 0.0994 |
0.028 | 11.7098 | 11300 | 0.0845 | 0.0973 |
0.0217 | 11.8135 | 11400 | 0.0827 | 0.0988 |
0.0197 | 11.9171 | 11500 | 0.0857 | 0.0966 |
0.0281 | 12.0207 | 11600 | 0.0868 | 0.0974 |
0.0189 | 12.1244 | 11700 | 0.0892 | 0.0965 |
0.0412 | 12.2280 | 11800 | 0.0893 | 0.0954 |
0.0311 | 12.3316 | 11900 | 0.0912 | 0.0964 |
0.0371 | 12.4352 | 12000 | 0.0880 | 0.0958 |
0.043 | 12.5389 | 12100 | 0.0878 | 0.0957 |
0.0423 | 12.6425 | 12200 | 0.0882 | 0.0967 |
0.0255 | 12.7461 | 12300 | 0.0893 | 0.0957 |
0.0274 | 12.8497 | 12400 | 0.0886 | 0.0949 |
0.0362 | 12.9534 | 12500 | 0.0869 | 0.0945 |
0.0154 | 13.0570 | 12600 | 0.0879 | 0.0936 |
0.02 | 13.1606 | 12700 | 0.0888 | 0.0933 |
0.0095 | 13.2642 | 12800 | 0.0910 | 0.0936 |
0.0261 | 13.3679 | 12900 | 0.0901 | 0.0929 |
0.0264 | 13.4715 | 13000 | 0.0893 | 0.0939 |
0.0185 | 13.5751 | 13100 | 0.0892 | 0.0930 |
0.0172 | 13.6788 | 13200 | 0.0901 | 0.0922 |
0.0177 | 13.7824 | 13300 | 0.0902 | 0.0935 |
0.0089 | 13.8860 | 13400 | 0.0906 | 0.0929 |
0.0163 | 13.9896 | 13500 | 0.0903 | 0.0922 |
0.0279 | 14.0933 | 13600 | 0.0908 | 0.0919 |
0.0332 | 14.1969 | 13700 | 0.0906 | 0.0916 |
0.0109 | 14.3005 | 13800 | 0.0921 | 0.0915 |
0.0248 | 14.4041 | 13900 | 0.0921 | 0.0914 |
0.015 | 14.5078 | 14000 | 0.0923 | 0.0910 |
0.0171 | 14.6114 | 14100 | 0.0920 | 0.0906 |
0.0027 | 14.7150 | 14200 | 0.0916 | 0.0904 |
0.0193 | 14.8187 | 14300 | 0.0913 | 0.0906 |
0.0173 | 14.9223 | 14400 | 0.0915 | 0.0909 |
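Wer in the table above is word error rate (lower is better). The card does not state which implementation computed it during evaluation; a minimal sketch using the `evaluate` package (an assumption, not necessarily the metric code used by the training script):

```python
import evaluate

wer_metric = evaluate.load("wer")

# Toy example with placeholder transcripts.
references = ["i ni ce"]
predictions = ["i ni che"]
print(wer_metric.compute(predictions=predictions, references=references))  # ~0.33
```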
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.3.1
- Tokenizers 0.21.0