---
library_name: transformers
license: mit
base_model: facebook/w2v-bert-2.0
tags:
  - automatic-speech-recognition
  - CLEAR-Global/luo_19_77h
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: w2v-bert-2.0-luo_19_77h
    results: []
---

w2v-bert-2.0-luo_19_77h

This model is a fine-tuned version of facebook/w2v-bert-2.0 on the CLEAR-Global/luo_19_77h dataset. It achieves the following results on the evaluation set (a sketch of the metric computation follows the list):

  • Loss: 0.2419
  • WER: 0.2906
  • CER: 0.0898
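
A minimal sketch of how these metrics can be recomputed with the `evaluate` library, as is typical for generated_from_trainer scripts (the transcript strings below are hypothetical placeholders):

```python
# Minimal sketch: WER/CER computation with the `evaluate` library.
# The prediction/reference strings are hypothetical placeholders.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["an example decoded transcript"]
references = ["an example reference transcript"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```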

Model description

More information needed

Intended uses & limitations

More information needed
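
Pending a fuller description, here is a minimal inference sketch. It assumes the checkpoint is published under a repo id like `CLEAR-Global/w2v-bert-2.0-luo_19_77h` (hypothetical; substitute the actual one) and that `sample.wav` is a 16 kHz mono recording:

```python
# Minimal inference sketch for this CTC checkpoint.
# The repo id and audio file are hypothetical placeholders.
import soundfile as sf
import torch
from transformers import AutoModelForCTC, AutoProcessor

model_id = "CLEAR-Global/w2v-bert-2.0-luo_19_77h"  # hypothetical repo id
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)

speech, sr = sf.read("sample.wav")  # expects 16 kHz mono audio
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (batch, time, vocab)

pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])  # greedy CTC decoding
```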

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 64
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 100000
  • mixed_precision_training: Native AMP
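
As a rough sketch, these settings map onto transformers `TrainingArguments` as follows (the argument names are the library's; the exact training script may set additional options):

```python
# Hedged sketch: the hyperparameters above expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-luo_19_77h",
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=2,  # effective train batch size: 64
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=100_000,
    fp16=True,  # "Native AMP" mixed precision
)
```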

Training results

| Training Loss | Epoch   | Step  | Validation Loss | WER    | CER    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| 0.469         | 0.8446  | 1000  | 0.8816          | 0.6085 | 0.1993 |
| 0.1171        | 1.6892  | 2000  | 0.6389          | 0.4184 | 0.1531 |
| 0.0837        | 2.5338  | 3000  | 0.5226          | 0.3805 | 0.1395 |
| 0.1017        | 3.3784  | 4000  | 0.3857          | 0.3532 | 0.1133 |
| 0.036         | 4.2230  | 5000  | 0.3766          | 0.3457 | 0.1161 |
| 0.0509        | 5.0676  | 6000  | 0.3433          | 0.3408 | 0.1149 |
| 0.0444        | 5.9122  | 7000  | 0.2983          | 0.3082 | 0.0981 |
| 0.0704        | 6.7568  | 8000  | 0.2803          | 0.2972 | 0.0968 |
| 0.0516        | 7.6014  | 9000  | 0.3242          | 0.2932 | 0.1006 |
| 0.0362        | 8.4459  | 10000 | 0.2760          | 0.3047 | 0.0967 |
| 0.0253        | 9.2905  | 11000 | 0.2727          | 0.2782 | 0.0908 |
| 0.026         | 10.1351 | 12000 | 0.2789          | 0.2959 | 0.1049 |
| 0.0274        | 10.9797 | 13000 | 0.2542          | 0.2782 | 0.0922 |
| 0.0218        | 11.8243 | 14000 | 0.2694          | 0.2646 | 0.0904 |
| 0.0201        | 12.6689 | 15000 | 0.2575          | 0.3007 | 0.0922 |
| 0.0201        | 13.5135 | 16000 | 0.2419          | 0.2901 | 0.0896 |
| 0.0216        | 14.3581 | 17000 | 0.2478          | 0.2795 | 0.0933 |
| 0.0079        | 15.2027 | 18000 | 0.2974          | 0.2844 | 0.0890 |
| 0.0352        | 16.0473 | 19000 | 0.2596          | 0.2959 | 0.0930 |
| 0.0302        | 16.8919 | 20000 | 0.2831          | 0.2491 | 0.0849 |
| 0.0115        | 17.7365 | 21000 | 0.2966          | 0.2751 | 0.0920 |

Framework versions

  • Transformers 4.48.1
  • PyTorch 2.6.0+cu124
  • Datasets 3.5.0
  • Tokenizers 0.21.1