---
library_name: transformers
license: cc-by-nc-4.0
base_model: facebook/mms-1b-all
tags:
- automatic-speech-recognition
- naijavoices_500h
- mms
- generated_from_trainer
metrics:
- wer
model-index:
- name: mms-1b-naijavoices_500h-hau-ft
results: []
---
# mms-1b-naijavoices_500h-hau-ft
This model is a fine-tuned version of [facebook/mms-1b-all](https://huggingface.co/facebook/mms-1b-all) on the NaijaVoices 500h Hausa dataset (`naijavoices_500h`).
It achieves the following results on the evaluation set (a sketch for recomputing these metrics follows the list):
- Loss: 0.3121
- Wer: 0.3303
- Cer: 0.0830
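For reference, WER and CER can be recomputed with the 🤗 `evaluate` library. A minimal sketch with placeholder transcripts (not real model outputs):

```python
# Minimal sketch for recomputing WER/CER with the `evaluate` library.
# The prediction/reference strings below are placeholders.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["ina kwana"]  # model transcript (placeholder)
references = ["ina kwana"]   # ground-truth transcript (placeholder)

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```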
## Model description
This is [facebook/mms-1b-all](https://huggingface.co/facebook/mms-1b-all) fine-tuned for Hausa automatic speech recognition on roughly 500 hours of NaijaVoices audio (per the dataset name). On the held-out evaluation set it reaches a WER of 0.3303 and a CER of 0.0830.
## Intended uses & limitations
The model is intended for transcribing Hausa speech to text. It is released under the CC BY-NC 4.0 license, so commercial use is not permitted. As with most ASR fine-tunes, accuracy will likely degrade on domains, accents, and recording conditions that differ from the training data. A minimal inference sketch follows.
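A minimal sketch using the 🤗 Transformers pipeline; the repository id below is assumed from the model name and may need adjusting to wherever this checkpoint is actually hosted:

```python
# Minimal inference sketch. The repo id is assumed from the model name.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="ayymen/mms-1b-naijavoices_500h-hau-ft",  # assumed repo id
)

# Expects a path to an audio file (decoding requires ffmpeg);
# the pipeline resamples the audio to 16 kHz as needed.
print(asr("hausa_sample.wav")["text"])
```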
## Training and evaluation data
Fine-tuning used the `naijavoices_500h` Hausa data referenced above; the exact train/evaluation split is not documented in this card.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- total_eval_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 2.0
- mixed_precision_training: Native AMP
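For reproduction, the settings above map onto `TrainingArguments` roughly as follows. This is a sketch only: data loading, the CTC model/vocabulary setup, and the two-GPU launch (e.g. via `torchrun`) are omitted, and `output_dir` is illustrative.

```python
# Sketch of TrainingArguments matching the listed hyperparameters.
# With 2 GPUs and gradient_accumulation_steps=2, the effective batch
# sizes are 32 (train) and 16 (eval), as reported above.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="mms-1b-naijavoices_500h-hau-ft",  # illustrative
    learning_rate=1e-3,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,
    num_train_epochs=2.0,
    lr_scheduler_type="linear",
    warmup_steps=100,
    seed=42,
    optim="adamw_torch",
    fp16=True,  # "Native AMP" mixed precision
)
```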
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:------:|:-----:|:---------------:|:------:|:------:|
| 0.5183 | 0.0279 | 500 | 0.3932 | 0.4076 | 0.1051 |
| 0.4379 | 0.0559 | 1000 | 0.3776 | 0.3964 | 0.1009 |
| 0.5709 | 0.0838 | 1500 | 0.3829 | 0.3935 | 0.1001 |
| 0.6257 | 0.1117 | 2000 | 0.3710 | 0.3930 | 0.0994 |
| 0.5065 | 0.1397 | 2500 | 0.3796 | 0.3937 | 0.1008 |
| 0.4613 | 0.1676 | 3000 | 0.3637 | 0.3849 | 0.0977 |
| 0.5305 | 0.1955 | 3500 | 0.3674 | 0.3801 | 0.0971 |
| 0.4303 | 0.2235 | 4000 | 0.3663 | 0.3875 | 0.0980 |
| 0.6497 | 0.2514 | 4500 | 0.3617 | 0.3741 | 0.0959 |
| 0.3887 | 0.2794 | 5000 | 0.3620 | 0.3925 | 0.0985 |
| 0.4604 | 0.3073 | 5500 | 0.3570 | 0.3727 | 0.0947 |
| 0.4349 | 0.3352 | 6000 | 0.3529 | 0.3686 | 0.0940 |
| 0.5403 | 0.3632 | 6500 | 0.3518 | 0.3722 | 0.0941 |
| 0.4455 | 0.3911 | 7000 | 0.3522 | 0.3695 | 0.0937 |
| 0.4454 | 0.4190 | 7500 | 0.3527 | 0.3722 | 0.0941 |
| 0.3582 | 0.4470 | 8000 | 0.3479 | 0.3696 | 0.0932 |
| 0.6661 | 0.4749 | 8500 | 0.3453 | 0.3597 | 0.0911 |
| 0.4702 | 0.5028 | 9000 | 0.3472 | 0.3672 | 0.0929 |
| 0.3877 | 0.5308 | 9500 | 0.3467 | 0.3772 | 0.0952 |
| 0.5848 | 0.5587 | 10000 | 0.3422 | 0.3652 | 0.0920 |
| 0.4943 | 0.5866 | 10500 | 0.3444 | 0.3701 | 0.0926 |
| 0.5451 | 0.6146 | 11000 | 0.3441 | 0.3583 | 0.0908 |
| 0.4033 | 0.6425 | 11500 | 0.3424 | 0.3572 | 0.0907 |
| 0.4437 | 0.6704 | 12000 | 0.3427 | 0.3576 | 0.0906 |
| 0.4541 | 0.6984 | 12500 | 0.3375 | 0.3574 | 0.0901 |
| 0.3769 | 0.7263 | 13000 | 0.3381 | 0.3605 | 0.0901 |
| 0.3915 | 0.7543 | 13500 | 0.3357 | 0.3538 | 0.0892 |
| 0.5068 | 0.7822 | 14000 | 0.3373 | 0.3518 | 0.0892 |
| 0.3922 | 0.8101 | 14500 | 0.3362 | 0.3556 | 0.0895 |
| 0.3928 | 0.8381 | 15000 | 0.3337 | 0.3489 | 0.0887 |
| 0.487 | 0.8660 | 15500 | 0.3350 | 0.3580 | 0.0897 |
| 0.3974 | 0.8939 | 16000 | 0.3330 | 0.3545 | 0.0892 |
| 0.3988 | 0.9219 | 16500 | 0.3339 | 0.3468 | 0.0880 |
| 0.5077 | 0.9498 | 17000 | 0.3322 | 0.3529 | 0.0884 |
| 0.4159 | 0.9777 | 17500 | 0.3320 | 0.3509 | 0.0885 |
| 0.46 | 1.0057 | 18000 | 0.3313 | 0.3469 | 0.0881 |
| 0.4727 | 1.0336 | 18500 | 0.3320 | 0.3588 | 0.0893 |
| 0.5366 | 1.0616 | 19000 | 0.3292 | 0.3519 | 0.0891 |
| 0.7299 | 1.0895 | 19500 | 0.3292 | 0.3518 | 0.0882 |
| 0.4359 | 1.1174 | 20000 | 0.3292 | 0.3470 | 0.0872 |
| 0.4212 | 1.1454 | 20500 | 0.3266 | 0.3449 | 0.0873 |
| 0.4532 | 1.1733 | 21000 | 0.3264 | 0.3443 | 0.0868 |
| 0.5725 | 1.2012 | 21500 | 0.3264 | 0.3393 | 0.0857 |
| 0.4016 | 1.2292 | 22000 | 0.3249 | 0.3398 | 0.0861 |
| 0.4479 | 1.2571 | 22500 | 0.3243 | 0.3519 | 0.0875 |
| 0.3502 | 1.2851 | 23000 | 0.3253 | 0.3463 | 0.0867 |
| 0.4566 | 1.3130 | 23500 | 0.3207 | 0.3387 | 0.0854 |
| 0.4414 | 1.3409 | 24000 | 0.3218 | 0.3431 | 0.0858 |
| 0.4479 | 1.3689 | 24500 | 0.3243 | 0.3445 | 0.0864 |
| 0.4601 | 1.3968 | 25000 | 0.3197 | 0.3405 | 0.0858 |
| 0.4091 | 1.4247 | 25500 | 0.3219 | 0.3371 | 0.0851 |
| 0.3548 | 1.4527 | 26000 | 0.3207 | 0.3417 | 0.0856 |
| 0.4587 | 1.4806 | 26500 | 0.3183 | 0.3360 | 0.0851 |
| 0.51 | 1.5085 | 27000 | 0.3196 | 0.3393 | 0.0853 |
| 0.4705 | 1.5365 | 27500 | 0.3187 | 0.3382 | 0.0853 |
| 0.5046 | 1.5644 | 28000 | 0.3194 | 0.3337 | 0.0843 |
| 0.4924 | 1.5923 | 28500 | 0.3149 | 0.3327 | 0.0840 |
| 0.3216 | 1.6203 | 29000 | 0.3173 | 0.3395 | 0.0850 |
| 0.4593 | 1.6482 | 29500 | 0.3156 | 0.3297 | 0.0833 |
| 0.3163 | 1.6761 | 30000 | 0.3141 | 0.3355 | 0.0843 |
| 0.3492 | 1.7041 | 30500 | 0.3147 | 0.3337 | 0.0840 |
| 0.4529 | 1.7320 | 31000 | 0.3150 | 0.3363 | 0.0843 |
| 0.2976 | 1.7600 | 31500 | 0.3149 | 0.3320 | 0.0838 |
| 0.3691 | 1.7879 | 32000 | 0.3141 | 0.3313 | 0.0836 |
| 0.3154 | 1.8158 | 32500 | 0.3128 | 0.3296 | 0.0831 |
| 0.429 | 1.8438 | 33000 | 0.3134 | 0.3346 | 0.0841 |
| 0.3474 | 1.8717 | 33500 | 0.3136 | 0.3311 | 0.0836 |
| 0.4074 | 1.8996 | 34000 | 0.3130 | 0.3318 | 0.0832 |
| 0.3559 | 1.9276 | 34500 | 0.3123 | 0.3334 | 0.0835 |
| 0.3606 | 1.9555 | 35000 | 0.3119 | 0.3318 | 0.0834 |
| 0.4125 | 1.9834 | 35500 | 0.3123 | 0.3301 | 0.0830 |
### Framework versions
- Transformers 4.48.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0