Whisper Small Basque

This model (zuazo/whisper-small-eu-cv17_0) is a fine-tuned version of openai/whisper-small on the mozilla-foundation/common_voice_17_0 eu (Basque) dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3438
  • WER: 9.3325

Model description

More information needed

Intended uses & limitations

More information needed
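Until fuller documentation is added, the model can be used for Basque speech transcription with the standard transformers automatic-speech-recognition pipeline. A minimal sketch (the audio file name below is a placeholder, not part of this repository):

```python
def transcribe(audio_path, model_id="zuazo/whisper-small-eu-cv17_0"):
    """Transcribe an audio file with this fine-tuned Whisper checkpoint."""
    # Imported lazily so the sketch can be loaded without transformers installed.
    from transformers import pipeline

    asr = pipeline("automatic-speech-recognition", model=model_id)
    return asr(audio_path)["text"]


# Example (placeholder file name):
#   text = transcribe("example.wav")
```

The first call downloads the ~242M-parameter checkpoint from the Hub.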

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3.75e-05
  • train_batch_size: 64
  • eval_batch_size: 32
  • seed: 42
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 40000
  • mixed_precision_training: Native AMP
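The hyperparameters above correspond to a Seq2SeqTrainingArguments configuration along these lines. This is a reconstruction from the list, not the exact training script; the output_dir name is a placeholder:

```python
from transformers import Seq2SeqTrainingArguments

# Hedged reconstruction of the training configuration listed above.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-eu",  # placeholder name
    learning_rate=3.75e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",              # AdamW; betas=(0.9, 0.999), eps=1e-8 are defaults
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=40000,
    fp16=True,                        # mixed_precision_training: Native AMP
)
```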

Training results

| Training Loss | Epoch   | Step  | Validation Loss | WER     |
|---------------|---------|-------|-----------------|---------|
| 0.0781        | 2.3474  | 1000  | 0.2094          | 13.2938 |
| 0.03          | 4.6948  | 2000  | 0.2140          | 11.9132 |
| 0.0145        | 7.0423  | 3000  | 0.2377          | 11.5541 |
| 0.0086        | 9.3897  | 4000  | 0.2430          | 11.2078 |
| 0.0078        | 11.7371 | 5000  | 0.2533          | 11.2280 |
| 0.0061        | 14.0845 | 6000  | 0.2634          | 10.8927 |
| 0.0043        | 16.4319 | 7000  | 0.2667          | 10.6600 |
| 0.0053        | 18.7793 | 8000  | 0.2694          | 11.0154 |
| 0.0027        | 21.1268 | 9000  | 0.2741          | 10.7104 |
| 0.004         | 23.4742 | 10000 | 0.2827          | 11.2225 |
| 0.0041        | 25.8216 | 11000 | 0.2796          | 10.7232 |
| 0.0025        | 28.1690 | 12000 | 0.2887          | 10.7232 |
| 0.0032        | 30.5164 | 13000 | 0.2907          | 10.6820 |
| 0.0026        | 32.8638 | 14000 | 0.2950          | 10.9202 |
| 0.0016        | 35.2113 | 15000 | 0.2947          | 10.7672 |
| 0.0012        | 37.5587 | 16000 | 0.2980          | 10.4694 |
| 0.0015        | 39.9061 | 17000 | 0.2999          | 10.7030 |
| 0.0018        | 42.2535 | 18000 | 0.2982          | 10.8844 |
| 0.0008        | 44.6009 | 19000 | 0.3052          | 10.5638 |
| 0.002         | 46.9484 | 20000 | 0.3007          | 10.6966 |
| 0.0012        | 49.2958 | 21000 | 0.3069          | 10.3659 |
| 0.0017        | 51.6432 | 22000 | 0.3111          | 10.6655 |
| 0.0004        | 53.9906 | 23000 | 0.3161          | 10.0324 |
| 0.0006        | 56.3380 | 24000 | 0.3207          | 10.2340 |
| 0.0005        | 58.6854 | 25000 | 0.3111          | 10.1176 |
| 0.0003        | 61.0329 | 26000 | 0.3168          | 10.0333 |
| 0.0002        | 63.3803 | 27000 | 0.3247          | 10.0159 |
| 0.001         | 65.7277 | 28000 | 0.3191          | 10.3219 |
| 0.0001        | 68.0751 | 29000 | 0.3196          | 10.0617 |
| 0.0005        | 70.4225 | 30000 | 0.3221          | 10.1726 |
| 0.0001        | 72.7700 | 31000 | 0.3189          | 10.2138 |
| 0.0           | 75.1174 | 32000 | 0.3214          | 9.7759  |
| 0.0           | 77.4648 | 33000 | 0.3255          | 9.6348  |
| 0.0           | 79.8122 | 34000 | 0.3290          | 9.5432  |
| 0.0           | 82.1596 | 35000 | 0.3322          | 9.4864  |
| 0.0           | 84.5070 | 36000 | 0.3355          | 9.4525  |
| 0.0           | 86.8545 | 37000 | 0.3383          | 9.4168  |
| 0.0           | 89.2019 | 38000 | 0.3409          | 9.3921  |
| 0.0           | 91.5493 | 39000 | 0.3428          | 9.3481  |
| 0.0           | 93.8967 | 40000 | 0.3438          | 9.3325  |
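The WER values in the table are word error rates in percent. As a reference, here is a minimal pure-Python sketch of the metric (word-level Levenshtein distance divided by the number of reference words), independent of whatever evaluation library was actually used during training:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent:
    (substitutions + deletions + insertions) / reference word count * 100."""
    ref, hyp = reference.split(), hypothesis.split()
    # Word-level Levenshtein distance via dynamic programming, row by row.
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            curr.append(min(
                prev[j] + 1,             # deletion
                curr[j - 1] + 1,         # insertion
                prev[j - 1] + (r != h),  # substitution (free if words match)
            ))
        prev = curr
    return 100.0 * prev[-1] / len(ref)
```

For example, `wer("kaixo mundu", "kaixo mundua")` is 50.0: one substituted word out of two reference words.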

Framework versions

  • Transformers 4.52.3
  • PyTorch 2.6.0+cu124
  • Datasets 3.6.0
  • Tokenizers 0.21.1