# wav2vec2-xls-r-2b-faroese-100h-30-epochs_v20250102
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-2b](https://huggingface.co/facebook/wav2vec2-xls-r-2b) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1139
- WER: 18.6633
- CER: 4.0333
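WER and CER are word- and character-level error rates: the Levenshtein edit distance between hypothesis and reference, divided by the reference length, times 100. A minimal pure-Python sketch of how these metrics are computed (helper names are illustrative, not from the training code):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (rolling 1-D DP row)."""
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            # deletion, insertion, substitution (cost 0 on a match)
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (r != h))
    return dp[-1]

def wer(ref, hyp):
    """Word error rate in percent: word-level edits / reference word count."""
    ref_words, hyp_words = ref.split(), hyp.split()
    return 100.0 * edit_distance(ref_words, hyp_words) / len(ref_words)

def cer(ref, hyp):
    """Character error rate in percent: character edits / reference length."""
    return 100.0 * edit_distance(list(ref), list(hyp)) / len(ref)
```

For example, `wer("hon er her", "hon var her")` is one substitution out of three reference words, i.e. about 33.3.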
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 6000
- num_epochs: 30
- mixed_precision_training: Native AMP
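With a per-device train batch size of 16 and 2 gradient-accumulation steps, each optimizer update sees 16 × 2 = 32 examples, matching the listed total train batch size. A hedged sketch of how these values map onto a `transformers.TrainingArguments` configuration (the `output_dir` and the exact argument names for the 4.47 API are assumptions, not taken from the original training script):

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-2b-faroese-100h-30-epochs",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch: 16 * 2 = 32
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=6000,
    num_train_epochs=30,
    fp16=True,                       # "Native AMP" mixed precision
)
```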
### Training results
Training Loss | Epoch | Step | Validation Loss | WER | CER |
---|---|---|---|---|---|
1.8998 | 0.4877 | 1000 | 0.4521 | 49.7995 | 14.9781 |
1.4942 | 0.9754 | 2000 | 0.2517 | 33.0440 | 8.5873 |
1.6019 | 1.4628 | 3000 | 0.2145 | 30.8631 | 7.8267 |
1.306 | 1.9505 | 4000 | 0.2134 | 30.9028 | 7.9183 |
0.693 | 2.4379 | 5000 | 0.2248 | 31.6165 | 8.2299 |
0.7777 | 2.9256 | 6000 | 0.2083 | 30.8102 | 7.9735 |
0.7026 | 3.4131 | 7000 | 0.2166 | 31.1054 | 8.1226 |
0.6333 | 3.9008 | 8000 | 0.2145 | 30.4534 | 7.9135 |
0.5643 | 4.3882 | 9000 | 0.1924 | 28.5897 | 7.2910 |
0.5885 | 4.8759 | 10000 | 0.1930 | 29.0038 | 7.4007 |
0.5157 | 5.3633 | 11000 | 0.1756 | 27.6953 | 6.9470 |
0.4751 | 5.8510 | 12000 | 0.1788 | 27.1754 | 6.9241 |
0.398 | 6.3385 | 13000 | 0.1640 | 27.0917 | 6.6843 |
0.4065 | 6.8261 | 14000 | 0.1658 | 27.3560 | 6.8445 |
0.3841 | 7.3136 | 15000 | 0.1619 | 26.1665 | 6.4294 |
0.3785 | 7.8013 | 16000 | 0.1623 | 25.7347 | 6.3434 |
0.3045 | 8.2887 | 17000 | 0.1600 | 25.5452 | 6.2559 |
0.345 | 8.7764 | 18000 | 0.1502 | 25.3029 | 6.1770 |
0.2431 | 9.2638 | 19000 | 0.1517 | 24.7786 | 6.0634 |
0.2758 | 9.7515 | 20000 | 0.1484 | 24.3556 | 5.8432 |
0.2415 | 10.2390 | 21000 | 0.1461 | 23.8225 | 5.7959 |
0.2328 | 10.7267 | 22000 | 0.1413 | 23.6155 | 5.6744 |
0.229 | 11.2141 | 23000 | 0.1379 | 23.6727 | 5.5450 |
0.2182 | 11.7018 | 24000 | 0.1402 | 23.3819 | 5.5253 |
0.1979 | 12.1892 | 25000 | 0.1357 | 23.1617 | 5.4582 |
0.1805 | 12.6769 | 26000 | 0.1327 | 22.5228 | 5.3162 |
0.1813 | 13.1644 | 27000 | 0.1268 | 22.6990 | 5.2744 |
0.1841 | 13.6520 | 28000 | 0.1329 | 22.3642 | 5.2428 |
0.1683 | 14.1395 | 29000 | 0.1303 | 22.3862 | 5.1955 |
0.1686 | 14.6272 | 30000 | 0.1257 | 22.3422 | 5.1892 |
0.1472 | 15.1146 | 31000 | 0.1294 | 22.1880 | 5.1410 |
0.1436 | 15.6023 | 32000 | 0.1221 | 21.5403 | 4.9414 |
0.1349 | 16.0897 | 33000 | 0.1228 | 21.5755 | 4.9493 |
0.1234 | 16.5774 | 34000 | 0.1316 | 21.6240 | 4.9927 |
0.1248 | 17.0649 | 35000 | 0.1181 | 21.2936 | 4.8215 |
0.1241 | 17.5525 | 36000 | 0.1212 | 21.3817 | 4.8886 |
0.1033 | 18.0400 | 37000 | 0.1347 | 21.5932 | 4.9619 |
0.0939 | 18.5277 | 38000 | 0.1245 | 21.2187 | 4.8325 |
0.0985 | 19.0151 | 39000 | 0.1173 | 20.9191 | 4.7063 |
0.0899 | 19.5028 | 40000 | 0.1282 | 20.8530 | 4.7284 |
0.0895 | 19.9905 | 41000 | 0.1322 | 20.8089 | 4.6676 |
0.0968 | 20.4779 | 42000 | 0.1202 | 20.4785 | 4.5532 |
0.0905 | 20.9656 | 43000 | 0.1241 | 20.4785 | 4.5777 |
0.0832 | 21.4531 | 44000 | 0.1160 | 20.3199 | 4.5438 |
0.0958 | 21.9407 | 45000 | 0.1193 | 20.1172 | 4.4467 |
0.0689 | 22.4282 | 46000 | 0.1152 | 19.8440 | 4.3970 |
0.0724 | 22.9159 | 47000 | 0.1173 | 19.8705 | 4.3781 |
0.0714 | 23.4033 | 48000 | 0.1147 | 19.8352 | 4.3994 |
0.0645 | 23.8910 | 49000 | 0.1165 | 19.4475 | 4.3047 |
0.0659 | 24.3784 | 50000 | 0.1181 | 19.6766 | 4.3355 |
0.0503 | 24.8661 | 51000 | 0.1168 | 19.4123 | 4.2487 |
0.0619 | 25.3536 | 52000 | 0.1147 | 19.0598 | 4.1800 |
0.0589 | 25.8413 | 53000 | 0.1165 | 19.2008 | 4.1879 |
0.0574 | 26.3287 | 54000 | 0.1152 | 19.0201 | 4.1516 |
0.0475 | 26.8164 | 55000 | 0.1109 | 18.9849 | 4.1185 |
0.0437 | 27.3038 | 56000 | 0.1191 | 19.0378 | 4.1469 |
0.0465 | 27.7915 | 57000 | 0.1170 | 18.9276 | 4.1153 |
0.0514 | 28.2790 | 58000 | 0.1144 | 18.8571 | 4.0972 |
0.0481 | 28.7666 | 59000 | 0.1114 | 18.6589 | 4.0270 |
0.0586 | 29.2541 | 60000 | 0.1120 | 18.6677 | 4.0428 |
0.057 | 29.7418 | 61000 | 0.1139 | 18.6633 | 4.0333 |
### Framework versions
- Transformers 4.47.1
- PyTorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0