Whisper Medium Basque

This model (zuazo/whisper-medium-eu-cv17_0) is a fine-tuned version of openai/whisper-medium on the Basque (eu) split of the mozilla-foundation/common_voice_17_0 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3147
  • WER: 7.7586
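WER here is the word error rate, reported as a percentage of the reference word count. For reference, a minimal self-contained sketch of how the metric is computed (word-level edit distance over the reference length; this is an illustration, not the evaluation code used for this card):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length, in percent."""
    ref = reference.split()
    hyp = hypothesis.split()
    # One-row dynamic-programming table over the hypothesis words.
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev = d[0]          # distance(ref[:i-1], hyp[:j-1]) for j = 1
        d[0] = i
        for j, h in enumerate(hyp, 1):
            cur = d[j]
            # deletion, insertion, substitution/match
            d[j] = min(d[j] + 1, d[j - 1] + 1, prev + (r != h))
            prev = cur
    return 100.0 * d[len(hyp)] / len(ref)

print(wer("kaixo mundu zabala", "kaixo mundo zabala"))  # one substitution in three words
```

One substituted word out of three gives a WER of 33.33; identical strings give 0.0.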

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3.75e-05
  • train_batch_size: 64
  • eval_batch_size: 32
  • seed: 42
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 40000
  • mixed_precision_training: Native AMP
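Together, these settings imply a learning rate that warms up linearly from 0 to 3.75e-05 over the first 500 steps and then decays linearly to 0 at step 40000. A minimal sketch of that schedule (mirroring the behavior of the linear scheduler in Transformers; the function name is ours):

```python
def linear_lr(step: int, peak_lr: float = 3.75e-05,
              warmup_steps: int = 500, training_steps: int = 40000) -> float:
    """Linear warmup to peak_lr, then linear decay to 0 at training_steps."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * max(0.0, (training_steps - step) / (training_steps - warmup_steps))

print(linear_lr(250))    # halfway through warmup: half the peak rate
print(linear_lr(500))    # peak learning rate
print(linear_lr(40000))  # fully decayed: 0.0
```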

Training results

| Training Loss | Epoch   | Step  | Validation Loss | WER     |
|:-------------:|:-------:|:-----:|:---------------:|:-------:|
| 0.0757        | 2.3474  | 1000  | 0.1944          | 11.3425 |
| 0.0341        | 4.6948  | 2000  | 0.2069          | 10.3402 |
| 0.0167        | 7.0423  | 3000  | 0.2218          | 9.6513  |
| 0.0109        | 9.3897  | 4000  | 0.2235          | 9.3508  |
| 0.0104        | 11.7371 | 5000  | 0.2353          | 9.7750  |
| 0.0077        | 14.0845 | 6000  | 0.2396          | 9.7118  |
| 0.0052        | 16.4319 | 7000  | 0.2522          | 9.3985  |
| 0.0059        | 18.7793 | 8000  | 0.2482          | 9.4883  |
| 0.0075        | 21.1268 | 9000  | 0.2527          | 9.8034  |
| 0.0048        | 23.4742 | 10000 | 0.2603          | 9.2785  |
| 0.0033        | 25.8216 | 11000 | 0.2584          | 9.0778  |
| 0.0039        | 28.1690 | 12000 | 0.2630          | 9.0962  |
| 0.0036        | 30.5164 | 13000 | 0.2666          | 9.3646  |
| 0.0031        | 32.8638 | 14000 | 0.2695          | 9.4342  |
| 0.0019        | 35.2113 | 15000 | 0.2687          | 9.0384  |
| 0.0025        | 37.5587 | 16000 | 0.2737          | 9.7768  |
| 0.0016        | 39.9061 | 17000 | 0.2789          | 8.8690  |
| 0.0011        | 42.2535 | 18000 | 0.2796          | 8.6280  |
| 0.0006        | 44.6009 | 19000 | 0.2774          | 8.9422  |
| 0.0025        | 46.9484 | 20000 | 0.2801          | 9.0458  |
| 0.0006        | 49.2958 | 21000 | 0.2815          | 8.6015  |
| 0.0016        | 51.6432 | 22000 | 0.2869          | 9.2134  |
| 0.0009        | 53.9906 | 23000 | 0.2873          | 8.9716  |
| 0.0007        | 56.3380 | 24000 | 0.2848          | 9.0183  |
| 0.0006        | 58.6854 | 25000 | 0.2849          | 8.7489  |
| 0.0004        | 61.0329 | 26000 | 0.2960          | 8.9514  |
| 0.0007        | 63.3803 | 27000 | 0.2874          | 8.7709  |
| 0.0005        | 65.7277 | 28000 | 0.2929          | 8.9193  |
| 0.0           | 68.0751 | 29000 | 0.2901          | 8.4988  |
| 0.0           | 70.4225 | 30000 | 0.2943          | 8.1296  |
| 0.0           | 72.7700 | 31000 | 0.2997          | 8.0307  |
| 0.0           | 75.1174 | 32000 | 0.3037          | 7.9812  |
| 0.0           | 77.4648 | 33000 | 0.3072          | 7.9483  |
| 0.0           | 79.8122 | 34000 | 0.3092          | 7.8667  |
| 0.0           | 82.1596 | 35000 | 0.3121          | 7.8264  |
| 0.0           | 84.5070 | 36000 | 0.3147          | 7.7586  |
| 0.0001        | 86.8545 | 37000 | 0.3071          | 8.1003  |
| 0.0           | 89.2019 | 38000 | 0.3069          | 8.0426  |
| 0.0           | 91.5493 | 39000 | 0.3081          | 8.0316  |
| 0.0           | 93.8967 | 40000 | 0.3088          | 8.0124  |
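Note that the reported checkpoint (step 36000) is the one with the lowest WER, not the lowest validation loss: the final evaluations reach a slightly lower loss but a higher WER. A small sketch of that selection over the last few rows of the table above:

```python
# (step, validation_loss, wer) copied from the final evaluation rows above
rows = [
    (34000, 0.3092, 7.8667),
    (35000, 0.3121, 7.8264),
    (36000, 0.3147, 7.7586),
    (37000, 0.3071, 8.1003),
    (38000, 0.3069, 8.0426),
]

# Select the checkpoint by lowest WER rather than lowest loss.
best_step, best_loss, best_wer = min(rows, key=lambda r: r[2])
print(best_step, best_wer)  # 36000 7.7586
```

Selecting by loss instead would pick step 38000, whose WER is about 0.28 points worse.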

Framework versions

  • Transformers 4.52.3
  • Pytorch 2.6.0+cu124
  • Datasets 3.6.0
  • Tokenizers 0.21.1