wav2vec2-xls-r-1b-dutch-1

This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b for Dutch automatic speech recognition; the fine-tuning dataset is not documented. It achieves the following results on the evaluation set:

  • Loss: 0.4583
  • WER: 0.3999
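
No usage example is included in the card. A minimal inference sketch, assuming the checkpoint is published as `golesheed/wav2vec2-xls-r-1b-dutch-1` and that the input is 16 kHz mono audio (for example loaded with `librosa.load(path, sr=16_000)`), could look like this:

```python
import torch
from transformers import AutoModelForCTC, AutoProcessor

# Assumed repository id; adjust if the model lives under a different name.
MODEL_ID = "golesheed/wav2vec2-xls-r-1b-dutch-1"

processor = AutoProcessor.from_pretrained(MODEL_ID)
model = AutoModelForCTC.from_pretrained(MODEL_ID)
model.eval()


def transcribe(speech):
    """Transcribe a 1-D float waveform sampled at 16 kHz (greedy CTC decoding)."""
    inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    predicted_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(predicted_ids)[0]
```

Greedy CTC decoding is used here; decoding with a language model would typically lower the WER further.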

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.02
  • num_epochs: 20
  • mixed_precision_training: Native AMP
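
These hyperparameters map onto `transformers.TrainingArguments` roughly as sketched below. This is a reconstruction from the list above, not the original training script; `output_dir` is a placeholder, and the card does not say how many devices the train batch size of 32 was spread over.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-1b-dutch-1",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=32,  # assumes a single training device
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,
    num_train_epochs=20,
    fp16=True,  # native AMP mixed-precision training
)
```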

Training results

| Training Loss | Epoch | Step | Validation Loss | WER    |
|---------------|-------|------|-----------------|--------|
| 2.8991        | 0.55  | 100  | 2.9364          | 1.0000 |
| 2.7986        | 1.09  | 200  | 2.8196          | 0.9999 |
| 1.8336        | 1.64  | 300  | 1.1436          | 0.7846 |
| 1.1463        | 2.19  | 400  | 0.7366          | 0.4903 |
| 0.9671        | 2.73  | 500  | 0.6065          | 0.4828 |
| 0.8179        | 3.28  | 600  | 0.6345          | 0.4917 |
| 0.6773        | 3.83  | 700  | 0.5766          | 0.4867 |
| 0.8431        | 4.37  | 800  | 0.6140          | 0.4933 |
| 0.6788        | 4.92  | 900  | 0.5155          | 0.4624 |
| 0.6595        | 5.46  | 1000 | 0.5325          | 0.4545 |
| 0.8755        | 6.01  | 1100 | 0.5265          | 0.4610 |
| 0.4818        | 6.56  | 1200 | 0.5042          | 0.4385 |
| 0.4767        | 7.10  | 1300 | 0.5227          | 0.4374 |
| 0.9617        | 7.65  | 1400 | 0.5348          | 0.4347 |
| 0.5603        | 8.20  | 1500 | 0.5012          | 0.4332 |
| 0.955         | 8.74  | 1600 | 0.5112          | 0.4290 |
| 0.5913        | 9.29  | 1700 | 0.4789          | 0.4211 |
| 0.7202        | 9.84  | 1800 | 0.4987          | 0.4197 |
| 0.5045        | 10.38 | 1900 | 0.4995          | 0.4252 |
| 0.4371        | 10.93 | 2000 | 0.4892          | 0.4171 |
| 0.3178        | 11.48 | 2100 | 0.4852          | 0.4114 |
| 0.3184        | 12.02 | 2200 | 0.4781          | 0.4156 |
| 0.2477        | 12.57 | 2300 | 0.4767          | 0.4110 |
| 0.3052        | 13.11 | 2400 | 0.4650          | 0.4082 |
| 0.3833        | 13.66 | 2500 | 0.4639          | 0.4043 |
| 0.3854        | 14.21 | 2600 | 0.4725          | 0.4075 |
| 0.2411        | 14.75 | 2700 | 0.4773          | 0.4040 |
| 0.3219        | 15.30 | 2800 | 0.4619          | 0.4021 |
| 0.241         | 15.85 | 2900 | 0.4490          | 0.4022 |
| 0.2043        | 16.39 | 3000 | 0.4624          | 0.3998 |
| 0.211         | 16.94 | 3100 | 0.4669          | 0.4036 |
| 0.1777        | 17.49 | 3200 | 0.4657          | 0.4010 |
| 0.1909        | 18.03 | 3300 | 0.4552          | 0.4003 |
| 0.2134        | 18.58 | 3400 | 0.4584          | 0.4004 |
| 0.1713        | 19.13 | 3500 | 0.4578          | 0.4000 |
| 0.1634        | 19.67 | 3600 | 0.4583          | 0.3999 |
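
The WER column is the word error rate on the evaluation set. A sketch of how such a score can be computed with the `evaluate` library (the Dutch reference/prediction pair below is made up purely for illustration):

```python
import evaluate

wer_metric = evaluate.load("wer")

# Hypothetical reference and hypothesis, only to illustrate the metric.
references = ["dit is een voorbeeldzin voor spraakherkenning"]
predictions = ["dit is een voorbeeld zin voor spraakherkenning"]

# One substitution plus one insertion over six reference words -> about 0.33.
print(wer_metric.compute(predictions=predictions, references=references))
```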

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.7.0+cu126
  • Datasets 2.16.1
  • Tokenizers 0.15.2