---
library_name: transformers
language:
- lg
base_model: asr-africa/wav2vec2-asr-africa-base
tags:
- asr
- luganda
- wav2vec2-base
- speech
- asr-africa
- robust-fine-tuning
- lg
- generated_from_trainer
datasets:
- mozilla-foundation/common_voice_17_0
- google/fleurs
metrics:
- wer
model-index:
- name: Wav2Vec2-Base - Luganda - asr-africa
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: common_voice_17_0
      type: mozilla-foundation/common_voice_17_0
    metrics:
    - name: Wer
      type: wer
      value: 0.1883851956379634
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: fleurs
      type: google/fleurs
    metrics:
    - name: Wer
      type: wer
      value: 0.1883851956379634
---
# Wav2Vec2-Base - Luganda - asr-africa
This model is a fine-tuned version of [asr-africa/wav2vec2-asr-africa-base](https://huggingface.co/asr-africa/wav2vec2-asr-africa-base) on the common_voice_17_0 and fleurs datasets. It achieves the following results on the evaluation set (a usage sketch follows the list):
- Loss: 0.1244
- Wer: 0.1884
- Cer: 0.0350
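Below is a minimal inference sketch using the standard Transformers CTC API. The repo id and audio path are placeholders (the exact Hub id of this checkpoint is not stated in this card), and plain greedy decoding is assumed rather than any external language-model decoder.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder id -- substitute the actual Hub repo id of this checkpoint.
model_id = "asr-africa/wav2vec2-base-luganda"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id).eval()

# Load a local recording and resample to the 16 kHz rate wav2vec2 expects.
waveform, sample_rate = torchaudio.load("sample.wav")  # placeholder path
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(
    waveform.squeeze(0).numpy(), sampling_rate=16_000, return_tensors="pt"
)
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: frame-wise argmax, then collapse repeats and blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```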
## Model description

A wav2vec2-base CTC model for Luganda (lg) automatic speech recognition, obtained by fine-tuning the asr-africa/wav2vec2-asr-africa-base checkpoint.
## Intended uses & limitations

The model is intended for transcribing Luganda speech sampled at 16 kHz. It has only been evaluated on Common Voice 17.0 and FLEURS, so accuracy on other domains, accents, or recording conditions is not characterized.
## Training and evaluation data

Training and evaluation used the Luganda portions of mozilla-foundation/common_voice_17_0 and google/fleurs, as sketched below. The exact split construction is not documented in this card.
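A minimal sketch of how the two corpora could be loaded and merged with the `datasets` library. The config names `"lg"` (Common Voice) and `"lg_ug"` (FLEURS) and the transcript column names are assumptions taken from the respective dataset cards; Common Voice on the Hub is gated, so authentication may be required.

```python
from datasets import Audio, concatenate_datasets, load_dataset

# Assumed Luganda configs -- "lg" for Common Voice, "lg_ug" for FLEURS.
cv = load_dataset("mozilla-foundation/common_voice_17_0", "lg", split="train")
fleurs = load_dataset("google/fleurs", "lg_ug", split="train")

# Resample both corpora to the 16 kHz rate used by wav2vec2.
cv = cv.cast_column("audio", Audio(sampling_rate=16_000))
fleurs = fleurs.cast_column("audio", Audio(sampling_rate=16_000))

# Keep only the shared columns, aligning FLEURS's "transcription"
# with Common Voice's "sentence" before concatenating.
cv = cv.select_columns(["audio", "sentence"])
fleurs = fleurs.select_columns(["audio", "transcription"]).rename_column(
    "transcription", "sentence"
)
train = concatenate_datasets([cv, fleurs])
```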
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 7e-05
- train_batch_size: 64
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100.0
- mixed_precision_training: Native AMP
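A minimal sketch of a `TrainingArguments` configuration mirroring the list above. The output directory is a placeholder, and any argument not listed above is left at its Transformers default (in particular, the AdamW betas and epsilon shown above are the defaults).

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-base-luganda",  # placeholder path
    learning_rate=7e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",        # AdamW; betas=(0.9, 0.999), eps=1e-8 are defaults
    lr_scheduler_type="linear",
    warmup_ratio=0.01,
    num_train_epochs=100.0,
    fp16=True,                  # "Native AMP" mixed-precision training
)
```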
### Training results
Training Loss | Epoch | Step | Cer | Validation Loss | Wer |
---|---|---|---|---|---|
3.2575 | 1.0 | 3012 | 0.1093 | 0.3726 | 0.5341 |
0.8396 | 2.0 | 6024 | 0.0781 | 0.2538 | 0.3984 |
0.7487 | 3.0 | 9036 | 0.0682 | 0.2277 | 0.3531 |
0.7226 | 4.0 | 12048 | 0.0643 | 0.2133 | 0.3364 |
0.7096 | 5.0 | 15060 | 0.0612 | 0.2084 | 0.3211 |
0.6979 | 6.0 | 18072 | 0.0640 | 0.2091 | 0.3287 |
0.6899 | 7.0 | 21084 | 0.0608 | 0.2019 | 0.3162 |
0.6765 | 8.0 | 24096 | 0.0601 | 0.1973 | 0.3106 |
0.6701 | 9.0 | 27108 | 0.0582 | 0.1928 | 0.3047 |
0.6621 | 10.0 | 30120 | 0.0582 | 0.1924 | 0.3039 |
0.6554 | 11.0 | 33132 | 0.0566 | 0.1867 | 0.2983 |
0.6475 | 12.0 | 36144 | 0.0552 | 0.1829 | 0.2874 |
0.6429 | 13.0 | 39156 | 0.0542 | 0.1802 | 0.2853 |
0.6351 | 14.0 | 42168 | 0.0553 | 0.1826 | 0.2873 |
0.6319 | 15.0 | 45180 | 0.0544 | 0.1793 | 0.2832 |
0.6251 | 16.0 | 48192 | 0.0548 | 0.1785 | 0.2838 |
0.6172 | 17.0 | 51204 | 0.0517 | 0.1709 | 0.2719 |
0.6122 | 18.0 | 54216 | 0.0521 | 0.1720 | 0.2716 |
0.6068 | 19.0 | 57228 | 0.0505 | 0.1694 | 0.2665 |
0.6035 | 20.0 | 60240 | 0.0497 | 0.1670 | 0.2628 |
0.5957 | 21.0 | 63252 | 0.0504 | 0.1704 | 0.2644 |
0.5909 | 22.0 | 66264 | 0.0493 | 0.1653 | 0.2599 |
0.5879 | 23.0 | 69276 | 0.0487 | 0.1675 | 0.2573 |
0.5966 | 24.0 | 72288 | 0.0510 | 0.1943 | 0.2739 |
0.6444 | 25.0 | 75300 | 0.0515 | 0.1868 | 0.2723 |
0.5999 | 26.0 | 78312 | 0.0491 | 0.1677 | 0.2578 |
0.5911 | 27.0 | 81324 | 0.0474 | 0.1679 | 0.2510 |
0.586 | 28.0 | 84336 | 0.0484 | 0.1723 | 0.2539 |
0.5816 | 29.0 | 87348 | 0.0477 | 0.1678 | 0.2526 |
0.5886 | 30.0 | 90360 | 0.0499 | 0.1824 | 0.2629 |
0.5978 | 31.0 | 93372 | 0.0470 | 0.1620 | 0.2491 |
0.5722 | 32.0 | 96384 | 0.0465 | 0.1584 | 0.2472 |
0.5615 | 33.0 | 99396 | 0.0461 | 0.1564 | 0.2421 |
0.5566 | 34.0 | 102408 | 0.0448 | 0.1530 | 0.2368 |
0.5514 | 35.0 | 105420 | 0.0432 | 0.1499 | 0.2309 |
0.5485 | 36.0 | 108432 | 0.0436 | 0.1511 | 0.2308 |
0.5451 | 37.0 | 111444 | 0.0439 | 0.1507 | 0.2319 |
0.5433 | 38.0 | 114456 | 0.0434 | 0.1482 | 0.2312 |
0.5391 | 39.0 | 117468 | 0.0435 | 0.1468 | 0.2291 |
0.5347 | 40.0 | 120480 | 0.0430 | 0.1463 | 0.2274 |
0.5313 | 41.0 | 123492 | 0.0422 | 0.1450 | 0.2240 |
0.5291 | 42.0 | 126504 | 0.0419 | 0.1446 | 0.2241 |
0.5269 | 43.0 | 129516 | 0.0427 | 0.1453 | 0.2255 |
0.5253 | 44.0 | 132528 | 0.0425 | 0.1446 | 0.2253 |
0.523 | 45.0 | 135540 | 0.0412 | 0.1430 | 0.2202 |
0.5192 | 46.0 | 138552 | 0.0409 | 0.1414 | 0.2172 |
0.518 | 47.0 | 141564 | 0.0405 | 0.1404 | 0.2160 |
0.5139 | 48.0 | 144576 | 0.0401 | 0.1400 | 0.2143 |
0.5133 | 49.0 | 147588 | 0.0412 | 0.1414 | 0.2180 |
0.5114 | 50.0 | 150600 | 0.0404 | 0.1402 | 0.2149 |
0.5087 | 51.0 | 153612 | 0.0406 | 0.1404 | 0.2165 |
0.5066 | 52.0 | 156624 | 0.0404 | 0.1389 | 0.2157 |
0.5037 | 53.0 | 159636 | 0.0398 | 0.1375 | 0.2132 |
0.5024 | 54.0 | 162648 | 0.0398 | 0.1372 | 0.2121 |
0.5 | 55.0 | 165660 | 0.0401 | 0.1379 | 0.2132 |
0.4976 | 56.0 | 168672 | 0.0386 | 0.1349 | 0.2072 |
0.4948 | 57.0 | 171684 | 0.0393 | 0.1362 | 0.2102 |
0.4933 | 58.0 | 174696 | 0.0389 | 0.1355 | 0.2068 |
0.4924 | 59.0 | 177708 | 0.0385 | 0.1361 | 0.2055 |
0.4901 | 60.0 | 180720 | 0.0384 | 0.1346 | 0.2054 |
0.4898 | 61.0 | 183732 | 0.0384 | 0.1334 | 0.2050 |
0.4873 | 62.0 | 186744 | 0.0384 | 0.1342 | 0.2060 |
0.4865 | 63.0 | 189756 | 0.0387 | 0.1346 | 0.2070 |
0.4842 | 64.0 | 192768 | 0.0387 | 0.1346 | 0.2072 |
0.4822 | 65.0 | 195780 | 0.0381 | 0.1325 | 0.2040 |
0.4814 | 66.0 | 198792 | 0.0371 | 0.1312 | 0.1989 |
0.4796 | 67.0 | 201804 | 0.0374 | 0.1312 | 0.2000 |
0.4771 | 68.0 | 204816 | 0.0372 | 0.1304 | 0.1997 |
0.4756 | 69.0 | 207828 | 0.0377 | 0.1308 | 0.2009 |
0.4745 | 70.0 | 210840 | 0.0370 | 0.1312 | 0.1982 |
0.4738 | 71.0 | 213852 | 0.0374 | 0.1307 | 0.2001 |
0.473 | 72.0 | 216864 | 0.0372 | 0.1307 | 0.1991 |
0.472 | 73.0 | 219876 | 0.0366 | 0.1292 | 0.1961 |
0.4693 | 74.0 | 222888 | 0.0364 | 0.1287 | 0.1952 |
0.4693 | 75.0 | 225900 | 0.0363 | 0.1284 | 0.1945 |
0.4664 | 76.0 | 228912 | 0.0368 | 0.1288 | 0.1969 |
0.4651 | 77.0 | 231924 | 0.0368 | 0.1287 | 0.1971 |
0.4641 | 78.0 | 234936 | 0.0366 | 0.1287 | 0.1952 |
0.462 | 79.0 | 237948 | 0.0364 | 0.1287 | 0.1945 |
0.4608 | 80.0 | 240960 | 0.0363 | 0.1275 | 0.1952 |
0.4594 | 81.0 | 243972 | 0.0361 | 0.1277 | 0.1939 |
0.4595 | 82.0 | 246984 | 0.0359 | 0.1268 | 0.1937 |
0.4575 | 83.0 | 249996 | 0.0362 | 0.1272 | 0.1942 |
0.4569 | 84.0 | 253008 | 0.0361 | 0.1268 | 0.1934 |
0.4552 | 85.0 | 256020 | 0.0357 | 0.1262 | 0.1916 |
0.4538 | 86.0 | 259032 | 0.0355 | 0.1259 | 0.1907 |
0.4532 | 87.0 | 262044 | 0.0355 | 0.1258 | 0.1912 |
0.4524 | 88.0 | 265056 | 0.0356 | 0.1260 | 0.1910 |
0.4501 | 89.0 | 268068 | 0.0360 | 0.1266 | 0.1928 |
0.4491 | 90.0 | 271080 | 0.0355 | 0.1252 | 0.1904 |
0.4486 | 91.0 | 274092 | 0.0352 | 0.1253 | 0.1889 |
0.4487 | 92.0 | 277104 | 0.0354 | 0.1253 | 0.1902 |
0.4471 | 93.0 | 280116 | 0.0352 | 0.1252 | 0.1894 |
0.4458 | 94.0 | 283128 | 0.0352 | 0.1253 | 0.1891 |
0.4449 | 95.0 | 286140 | 0.0351 | 0.1248 | 0.1884 |
0.4434 | 96.0 | 289152 | 0.0351 | 0.1247 | 0.1891 |
0.4435 | 97.0 | 292164 | 0.0352 | 0.1247 | 0.1891 |
0.4444 | 98.0 | 295176 | 0.0351 | 0.1245 | 0.1888 |
0.4429 | 99.0 | 298188 | 0.0351 | 0.1244 | 0.1887 |
0.4426 | 100.0 | 301200 | 0.0350 | 0.1244 | 0.1884 |
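The Wer and Cer columns above are word and character error rates. As an illustration of how such figures can be computed, here is a minimal sketch using the `evaluate` library (which wraps `jiwer` for these metrics); the Luganda strings are made-up examples, not data from these corpora.

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Illustrative reference/prediction pairs (hypothetical Luganda phrases).
references = ["webale nnyo", "tusanyuse okukulaba"]
predictions = ["webale nyo", "tusanyuse okukulaba"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```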
## Framework versions
- Transformers 4.51.3
- PyTorch 2.7.0+cu128
- Datasets 3.6.0
- Tokenizers 0.21.1