---
library_name: transformers
license: cc-by-nc-4.0
base_model: mms-meta/mms-zeroshot-300m
tags:
  - automatic-speech-recognition
  - BembaSpeech
  - mms
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: mms-zeroshot-bem-sv-female
    results: []
---

# mms-zeroshot-bem-sv-female

This model is a fine-tuned version of mms-meta/mms-zeroshot-300m on the BembaSpeech (Bemba) dataset. It achieves the following results on the evaluation set:

- Loss: 0.2722
- Wer: 0.4375
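Since this is a CTC speech-recognition checkpoint, it can be loaded with the standard `transformers` ASR pipeline. A minimal usage sketch — the repository id below is assumed from the model name and uploader and may differ from the actual Hub path, and the audio filename is a placeholder:

```python
from transformers import pipeline

# Assumed Hub repo id; verify against the actual model page.
asr = pipeline(
    "automatic-speech-recognition",
    model="csikasote/mms-zeroshot-bem-sv-female",
)

# Path to a local audio file (the pipeline resamples to 16 kHz as needed).
result = asr("bemba_sample.wav")
print(result["text"])
```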

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 10.0
- mixed_precision_training: Native AMP
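The `linear` scheduler with 100 warmup steps ramps the learning rate from 0 to 3e-4 over the first 100 optimizer steps, then decays it linearly to 0 by the final step (5000, per the training log). A minimal sketch of that schedule — the helper name is illustrative, not the library's API:

```python
def linear_warmup_lr(step, base_lr=3e-4, warmup_steps=100, total_steps=5000):
    """Linear warmup followed by linear decay to zero (the 'linear' scheduler)."""
    if step < warmup_steps:
        # Ramp up proportionally during warmup.
        return base_lr * step / warmup_steps
    # Decay linearly from base_lr (end of warmup) to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_warmup_lr(50))    # mid-warmup: 1.5e-4
print(linear_warmup_lr(100))   # warmup complete: 3e-4
print(linear_warmup_lr(5000))  # end of training: 0.0
```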

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer    |
|:-------------:|:------:|:----:|:---------------:|:------:|
| No log        | 0.3992 | 200  | 1.9434          | 1.0    |
| No log        | 0.7984 | 400  | 0.3648          | 0.5411 |
| 2.5692        | 1.1976 | 600  | 0.3371          | 0.5174 |
| 2.5692        | 1.5968 | 800  | 0.3229          | 0.5213 |
| 0.3941        | 1.9960 | 1000 | 0.3183          | 0.4915 |
| 0.3941        | 2.3952 | 1200 | 0.3068          | 0.5073 |
| 0.3941        | 2.7944 | 1400 | 0.3057          | 0.4688 |
| 0.3502        | 3.1936 | 1600 | 0.3017          | 0.4777 |
| 0.3502        | 3.5928 | 1800 | 0.2905          | 0.4647 |
| 0.3253        | 3.9920 | 2000 | 0.2857          | 0.4686 |
| 0.3253        | 4.3912 | 2200 | 0.2892          | 0.4601 |
| 0.3253        | 4.7904 | 2400 | 0.2848          | 0.4759 |
| 0.3066        | 5.1896 | 2600 | 0.2801          | 0.4444 |
| 0.3066        | 5.5888 | 2800 | 0.2752          | 0.4627 |
| 0.2988        | 5.9880 | 3000 | 0.2818          | 0.4614 |
| 0.2988        | 6.3872 | 3200 | 0.2759          | 0.4444 |
| 0.2988        | 6.7864 | 3400 | 0.2751          | 0.4382 |
| 0.2877        | 7.1856 | 3600 | 0.2726          | 0.4472 |
| 0.2877        | 7.5848 | 3800 | 0.2722          | 0.4484 |
| 0.2812        | 7.9840 | 4000 | 0.2710          | 0.4344 |
| 0.2812        | 8.3832 | 4200 | 0.2734          | 0.4410 |
| 0.2812        | 8.7824 | 4400 | 0.2734          | 0.4360 |
| 0.2742        | 9.1816 | 4600 | 0.2759          | 0.4398 |
| 0.2742        | 9.5808 | 4800 | 0.2740          | 0.4337 |
| 0.2731        | 9.9800 | 5000 | 0.2722          | 0.4382 |
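The Wer column is the word error rate: word-level edit distance (substitutions + insertions + deletions) divided by the number of reference words. A self-contained sketch of the metric — not the exact library implementation used during training, and the Bemba phrase is only an illustrative example:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution or match
    return dp[len(ref)][len(hyp)] / len(ref)

print(wer("uli shani mukwai", "uli shani"))  # 1 deletion / 3 words ≈ 0.333
```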

### Framework versions

- Transformers 4.45.0.dev0
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1