
Whisper Medium GA-EN Speech Translation, 2 epochs, 19k steps

This model is a fine-tuned version of openai/whisper-medium on the IWSLT-2023, FLEURS, BiteSize, SpokenWords, Tatoeba, and Wikimedia datasets. It achieves the following results on the evaluation set:

  • Loss: 1.5548
  • BLEU: 31.69
  • chrF: 50.38
  • WER: 64.9707

Model description

This model translates Irish (GA) speech into English (EN) text. It is a fine-tuned version of openai/whisper-medium trained for roughly two epochs (19,000 steps) on a mixture of Irish–English speech translation datasets.

Intended uses & limitations

The model is intended for translating Irish (GA) speech into English (EN) text. No further usage guidance or known limitations are documented.
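A minimal inference sketch with the transformers pipeline API is shown below. It is not taken from the model card; the audio file name is a placeholder for any Irish speech recording.

```python
# Minimal inference sketch (not from the model card). The audio path is a
# placeholder; the pipeline expects a path to a speech recording.
from transformers import pipeline

translator = pipeline(
    "automatic-speech-recognition",
    model="ymoslem/whisper-medium-ga2en-v5.3.2-19k-r",
)

# The fine-tuned model outputs English text for Irish speech input.
result = translator("irish_speech_sample.wav")  # hypothetical file name
print(result["text"])
```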

Training and evaluation data

The model was fine-tuned on the IWSLT-2023, FLEURS, BiteSize, SpokenWords, Tatoeba, and Wikimedia Irish–English datasets listed above. The metrics reported in this card are computed on the corresponding evaluation set.
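As one illustration of loading a public part of this mixture, the sketch below pulls the Irish configuration of FLEURS from the Hugging Face Hub; the exact preprocessing and dataset mixing used for this model are not documented, so treat it purely as an example.

```python
# Example only: loads the Irish ("ga_ie") configuration of the public
# google/fleurs dataset; the actual data pipeline for this model is not documented.
from datasets import load_dataset

fleurs_ga = load_dataset("google/fleurs", "ga_ie", split="train")
print(fleurs_ga)                          # row count and column names
print(fleurs_ga[0]["transcription"])      # Irish transcription of the first clip
```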

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.03
  • training_steps: 19000
  • mixed_precision_training: Native AMP
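These settings map naturally onto transformers.Seq2SeqTrainingArguments. The sketch below is an approximate reconstruction; any argument not listed above (output_dir, predict_with_generate) is an assumption rather than a value taken from this card.

```python
# Approximate reconstruction of the listed hyperparameters; arguments not in the
# list above (output_dir, predict_with_generate) are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-medium-ga2en",   # hypothetical output path
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.03,
    max_steps=19000,
    fp16=True,                             # "Native AMP" mixed-precision training
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the transformers defaults.
    predict_with_generate=True,            # assumed: needed to compute BLEU/chrF/WER
)
```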

Training results

Training Loss Epoch Step BLEU chrF Validation Loss WER
2.4382 0.0109 100 3.07 16.85 2.1114 171.0491
2.6151 0.0219 200 6.25 23.02 2.0207 126.9698
2.5699 0.0328 300 5.71 24.03 1.8660 155.5606
2.3084 0.0438 400 9.87 28.45 1.8084 129.0860
2.3327 0.0547 500 12.01 31.92 1.7823 102.7915
2.1495 0.0657 600 13.97 32.4 1.7238 98.6042
2.2164 0.0766 700 11.21 33.19 1.6538 146.0153
2.0071 0.0876 800 14.34 35.72 1.7038 96.9383
1.8334 0.0985 900 16.51 37.23 1.6329 96.8032
1.8359 0.1095 1000 17.87 35.94 1.6637 84.4665
1.7703 0.1204 1100 19.54 39.02 1.5626 79.7839
1.5805 0.1314 1200 20.19 40.4 1.5618 77.8028
1.4545 0.1423 1300 13.88 35.53 1.5599 112.5619
1.5177 0.1533 1400 18.79 40.11 1.4880 84.6916
1.6335 0.1642 1500 16.41 38.64 1.4996 96.9833
1.3809 0.1752 1600 18.3 40.17 1.4739 101.8910
1.2694 0.1861 1700 22.53 43.15 1.4498 76.9923
1.2321 0.1970 1800 19.92 42.59 1.4163 84.6015
1.1969 0.2080 1900 21.63 44.92 1.4137 85.3670
1.2023 0.2189 2000 20.42 41.57 1.3530 82.8906
1.1676 0.2299 2100 22.82 44.23 1.3723 78.1180
1.0332 0.2408 2200 26.73 44.75 1.3641 70.2386
0.8589 0.2518 2300 26.94 46.89 1.3344 72.7600
0.9829 0.2627 2400 28.15 47.21 1.3181 69.1130
0.8228 0.2737 2500 26.98 47.41 1.3049 74.0207
0.7667 0.2846 2600 30.0 49.42 1.2698 65.1058
0.8749 0.2956 2700 27.91 47.67 1.2878 66.9518
0.7504 0.3065 2800 32.03 50.35 1.2670 63.6650
0.7069 0.3175 2900 30.7 49.53 1.2771 64.4304
0.7199 0.3284 3000 30.21 48.93 1.2658 65.5561
0.6207 0.3394 3100 30.82 49.11 1.2687 66.0063
0.5995 0.3503 3200 31.99 50.94 1.2207 62.9446
0.6294 0.3612 3300 31.05 50.85 1.2422 64.7006
0.4612 0.3722 3400 33.1 51.82 1.2203 61.9090
0.5138 0.3831 3500 32.08 51.86 1.2007 63.0797
0.5059 0.3941 3600 31.8 51.19 1.2130 63.9352
0.417 0.4050 3700 32.45 51.41 1.1975 62.2692
0.2958 0.4160 3800 29.29 51.39 1.2046 62.7645
0.393 0.4269 3900 28.95 51.45 1.1968 63.1697
0.3858 0.4379 4000 29.54 51.58 1.1929 62.4043
0.5416 0.4488 4100 27.29 43.94 1.3522 67.9424
0.6644 0.4598 4200 23.16 44.45 1.4191 77.3976
0.5246 0.4707 4300 22.26 44.91 1.4221 77.2625
0.614 0.4817 4400 26.9 46.15 1.3956 70.4638
0.5973 0.4926 4500 25.55 45.51 1.4152 76.7222
0.544 0.5036 4600 23.54 47.87 1.4091 79.1085
0.5975 0.5145 4700 21.85 42.69 1.4644 78.5682
0.4675 0.5255 4800 22.93 43.69 1.4598 76.9023
0.7959 0.5364 4900 24.91 44.98 1.3884 74.5610
0.5936 0.5473 5000 26.91 44.88 1.4235 69.0680
0.4631 0.5583 5100 25.77 45.81 1.4002 74.0207
0.5188 0.5692 5200 28.37 45.48 1.4405 66.2765
0.4675 0.5802 5300 21.1 43.11 1.4045 92.1207
0.4214 0.5911 5400 25.62 44.82 1.4250 72.2197
0.4592 0.6021 5500 27.24 46.44 1.4107 70.0585
0.4809 0.6130 5600 27.93 47.42 1.3896 69.5182
0.4364 0.6240 5700 25.84 47.47 1.3808 77.6227
0.3333 0.6349 5800 26.46 47.08 1.4203 72.4899
0.3345 0.6459 5900 23.1 44.6 1.4763 81.2247
0.3368 0.6568 6000 24.55 45.76 1.4182 80.5493
0.3061 0.6678 6100 23.1 45.97 1.4218 81.3597
0.324 0.6787 6200 28.26 47.06 1.4453 67.5822
0.2667 0.6897 6300 27.87 46.14 1.4494 69.0230
0.2845 0.7006 6400 26.39 46.72 1.4448 71.4543
0.3125 0.7115 6500 27.81 46.45 1.4643 70.0135
0.264 0.7225 6600 26.27 47.75 1.4244 72.7600
0.2426 0.7334 6700 25.84 46.68 1.4081 76.4070
0.2174 0.7444 6800 30.67 47.92 1.4036 65.8262
0.2265 0.7553 6900 28.11 49.12 1.4174 71.2292
0.2016 0.7663 7000 30.43 49.47 1.4341 65.9163
0.1865 0.7772 7100 32.05 49.5 1.3690 63.1697
0.2148 0.7882 7200 32.29 49.91 1.3603 63.8901
0.2126 0.7991 7300 32.07 49.31 1.4046 63.6650
0.1594 0.8101 7400 29.94 47.48 1.4122 65.5110
0.1295 0.8210 7500 30.14 49.79 1.4243 65.7812
0.1378 0.8320 7600 31.23 49.42 1.4334 65.9613
0.1701 0.8429 7700 31.04 49.95 1.4149 65.6461
0.1102 0.8539 7800 31.37 50.2 1.4082 63.7100
0.1267 0.8648 7900 32.86 50.83 1.3642 60.8285
0.1384 0.8758 8000 33.47 49.61 1.3860 59.8829
0.1128 0.8867 8100 32.78 50.04 1.3840 61.8190
0.1197 0.8976 8200 33.69 50.94 1.3641 61.8190
0.1181 0.9086 8300 32.0 49.65 1.3913 63.5299
0.0866 0.9195 8400 30.39 48.48 1.4171 68.0324
0.0784 0.9305 8500 32.27 49.32 1.3850 63.3949
0.092 0.9414 8600 33.78 51.13 1.3880 61.2787
0.0685 0.9524 8700 34.33 51.23 1.3876 61.1887
0.0783 0.9633 8800 33.4 48.9 1.4010 62.5844
0.0735 0.9743 8900 33.72 49.01 1.4035 61.5038
0.0875 0.9852 9000 30.44 49.06 1.4064 67.5371
0.0822 0.9962 9100 34.64 51.51 1.3803 60.5133
0.041 1.0071 9200 34.66 52.06 1.3678 59.4327
0.0351 1.0181 9300 33.88 51.16 1.3739 61.3688
0.0368 1.0290 9400 35.2 51.73 1.3846 60.4232
0.035 1.0400 9500 34.23 51.32 1.3753 60.8735
0.0277 1.0509 9600 35.0 52.59 1.3788 60.0180
0.0247 1.0619 9700 34.69 51.7 1.3914 60.2882
0.0321 1.0728 9800 34.63 51.91 1.3804 60.6033
0.0286 1.0837 9900 33.92 51.64 1.3795 61.8640
0.0239 1.0947 10000 33.79 51.67 1.3818 61.6839
(from step 10100 onward, the metrics are logged in a different column order)
Training Loss Epoch Step Validation Loss BLEU chrF WER
0.085 1.1056 10100 1.5082 26.54 46.14 70.9140
0.1002 1.1166 10200 1.5156 31.06 47.27 64.7006
0.1144 1.1275 10300 1.5837 24.93 44.33 73.3003
0.1137 1.1385 10400 1.5372 28.96 47.2 65.7812
0.1182 1.1494 10500 1.5366 30.05 47.09 65.6461
0.1214 1.1604 10600 1.5160 26.83 46.73 70.2386
0.1413 1.1713 10700 1.5384 27.92 47.04 70.3287
0.1011 1.1823 10800 1.5791 27.71 46.13 70.5538
0.1187 1.1932 10900 1.6188 22.91 44.41 81.8100
0.1364 1.2042 11000 1.5807 29.38 45.46 67.3570
0.1158 1.2151 11100 1.5819 25.33 44.25 76.0919
0.1199 1.2261 11200 1.5727 27.52 46.0 68.3926
0.1213 1.2370 11300 1.5728 26.92 45.92 69.4732
0.1291 1.2479 11400 1.5743 28.13 44.72 67.9874
0.131 1.2589 11500 1.5337 29.42 48.65 66.6367
0.1279 1.2698 11600 1.6752 22.78 43.45 79.2436
0.116 1.2808 11700 1.6056 26.91 45.46 70.4638
0.1126 1.2917 11800 1.6341 26.11 45.71 70.5988
0.1263 1.3027 11900 1.6231 28.31 46.34 69.6983
0.1072 1.3136 12000 1.5580 30.51 47.66 65.5110
0.115 1.3246 12100 1.5944 28.13 46.39 68.2575
0.1014 1.3355 12200 1.5486 28.75 47.01 67.9874
0.1149 1.3465 12300 1.5973 29.4 46.21 67.4021
0.1131 1.3574 12400 1.5769 29.94 48.17 65.1959
0.1032 1.3684 12500 1.6363 25.02 47.46 78.4331
0.1103 1.3793 12600 1.6057 28.2 45.82 68.7528
0.109 1.3903 12700 1.5884 28.0 45.95 69.1130
0.0927 1.4012 12800 1.5881 29.86 47.67 67.5371
0.0829 1.4122 12900 1.5855 29.15 45.76 67.6722
0.0955 1.4231 13000 1.6313 27.59 46.3 69.7884
0.0874 1.4340 13100 1.6173 30.52 45.99 64.5205
0.0816 1.4450 13200 1.5864 30.89 47.34 65.1959
0.0836 1.4559 13300 1.6319 30.18 47.3 65.5110
0.0832 1.4669 13400 1.6353 28.85 47.8 67.3570
0.0622 1.4778 13500 1.6117 28.91 46.78 69.0680
0.0689 1.4888 13600 1.5919 31.32 47.68 64.7006
0.0921 1.4997 13700 1.6180 30.09 46.8 66.8167
0.0754 1.5107 13800 1.5755 30.77 47.55 64.4755
0.0844 1.5216 13900 1.5681 31.12 48.9 64.9707
0.0696 1.5326 14000 1.5481 31.27 49.63 64.0252
0.0914 1.5435 14100 1.5603 29.61 47.28 65.8262
0.0789 1.5545 14200 1.5896 31.85 48.49 62.8546
0.0572 1.5654 14300 1.5931 27.55 46.87 69.1580
0.0619 1.5764 14400 1.5988 28.82 46.89 66.4566
0.0566 1.5873 14500 1.5838 29.97 48.6 66.6817
0.0661 1.5982 14600 1.6447 30.47 46.25 65.5561
0.0607 1.6092 14700 1.5621 32.31 48.53 63.1247
0.0566 1.6201 14800 1.5838 31.08 49.25 66.3665
0.0354 1.6311 14900 1.5723 30.73 48.58 66.3215
0.0585 1.6420 15000 1.5825 29.84 47.27 66.5466
0.0542 1.6530 15100 1.6012 28.7 47.32 69.8784
0.0641 1.6639 15200 1.5662 28.8 47.35 67.4021
0.0588 1.6749 15300 1.5596 30.62 47.51 65.9613
0.0401 1.6858 15400 1.5719 30.74 48.55 66.0513
0.043 1.6968 15500 1.5979 29.37 47.1 69.3381
0.0384 1.7077 15600 1.5718 29.56 48.58 68.7978
0.0399 1.7187 15700 1.5873 30.84 47.72 66.9518
0.0437 1.7296 15800 1.5493 31.43 48.06 65.7812
0.0473 1.7406 15900 1.5185 31.4 49.44 64.1153
0.0477 1.7515 16000 1.5695 32.89 49.91 62.0891
0.0435 1.7625 16100 1.5550 32.99 50.86 62.0891
0.0478 1.7734 16200 1.5662 31.31 49.58 63.5750
0.0337 1.7843 16300 1.5792 31.22 49.44 63.8901
0.0387 1.7953 16400 1.5715 32.2 49.5 62.3143
0.0376 1.8062 16500 1.5783 31.95 49.34 64.0252
0.0357 1.8172 16600 1.5684 30.67 49.15 65.8712
0.028 1.8281 16700 1.5544 29.78 48.23 67.5822
0.0185 1.8391 16800 1.5419 31.16 50.1 64.0252
0.0408 1.8500 16900 1.5504 31.25 49.53 64.6556
0.0259 1.8610 17000 1.5501 30.73 50.13 68.3026
0.0276 1.8719 17100 1.5359 29.97 49.96 69.0680
0.0201 1.8829 17200 1.5504 32.06 51.35 63.2598
0.0268 1.8938 17300 1.5322 34.47 51.53 60.0180
0.023 1.9048 17400 1.5407 34.65 52.44 59.5678
0.0138 1.9157 17500 1.5653 33.97 51.65 61.5038
0.0164 1.9267 17600 1.5470 32.62 50.82 64.6556
0.018 1.9376 17700 1.5450 33.09 50.06 61.9541
0.0246 1.9485 17800 1.5473 31.67 50.21 65.1058
0.0196 1.9595 17900 1.5414 31.1 50.36 66.9068
0.0163 1.9704 18000 1.5453 32.63 50.62 62.9446
0.018 1.9814 18100 1.5361 32.99 50.91 61.2337
0.025 1.9923 18200 1.5394 33.18 50.61 61.2787
0.0081 2.0033 18300 1.5481 33.39 50.63 61.1436
0.0147 2.0142 18400 1.5500 33.39 51.22 61.3688
0.0121 2.0252 18500 1.5515 32.35 50.74 62.0441
0.0154 2.0361 18600 1.5524 31.18 50.35 66.0964
0.0157 2.0471 18700 1.5585 31.68 50.44 65.4660
0.0089 2.0580 18800 1.5559 33.33 50.68 61.2337
0.0112 2.0690 18900 1.5552 31.64 50.43 65.1959
0.0109 2.0799 19000 1.5548 31.69 50.38 64.9707
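The BLEU, chrF, and WER columns above can be reproduced with the Hugging Face evaluate library. The sketch below uses placeholder predictions and references, since the exact metric configuration behind this card is not stated.

```python
# Hedged sketch for computing the reported metrics; predictions and references
# are placeholders, and the exact metric settings used for this card are unknown.
import evaluate

bleu = evaluate.load("sacrebleu")
chrf = evaluate.load("chrf")
wer = evaluate.load("wer")

predictions = ["the cat is on the mat"]      # hypothetical model outputs
references = [["the cat sat on the mat"]]    # hypothetical reference translations

print(bleu.compute(predictions=predictions, references=references)["score"])
print(chrf.compute(predictions=predictions, references=references)["score"])
# WER expects plain reference strings rather than lists of alternatives.
print(wer.compute(predictions=predictions, references=[r[0] for r in references]))
```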

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.2.0+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1


Evaluation results

  • BLEU on IWSLT-2023, FLEURS, BiteSize, SpokenWords, Tatoeba, and Wikimedia
    self-reported
    31.690
  • WER on IWSLT-2023, FLEURS, BiteSize, SpokenWords, Tatoeba, and Wikimedia
    self-reported
    64.971