w2v-bert-2.0-chichewa_34_136h

This model is a fine-tuned version of facebook/w2v-bert-2.0 on the CLEAR-GLOBAL/CHICHEWA_34_136H - NA dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2952
  • Wer: 0.4020
  • Cer: 0.1153
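
The card does not include a usage example; the following is a minimal transcription sketch, assuming the checkpoint carries the standard CTC head (`Wav2Vec2BertForCTC`) and 16 kHz mono input audio. The path `audio.wav` is a placeholder.

```python
import torch
import torchaudio
from transformers import AutoProcessor, Wav2Vec2BertForCTC

model_id = "CLEAR-Global/w2v-bert-2.0-chichewa_34_136h"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2BertForCTC.from_pretrained(model_id)  # assumes a CTC fine-tune

# Load audio and resample to the 16 kHz rate expected by w2v-bert-2.0.
waveform, sample_rate = torchaudio.load("audio.wav")  # placeholder path
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

# The processor converts raw audio into the model's input features.
inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: argmax per frame, then collapse repeats and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```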

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 64
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 100000
  • mixed_precision_training: Native AMP
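
As a rough illustration, the hyperparameters above map onto `transformers.TrainingArguments` as in the sketch below; this is an approximation under stated assumptions, not the actual training script (which is not included in this card).

```python
from transformers import TrainingArguments

# Sketch only: output_dir is a placeholder, and fp16=True stands in for
# "Native AMP" mixed-precision training.
training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-chichewa_34_136h",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=2,  # effective train batch size: 64
    seed=42,
    optim="adamw_torch",            # betas=(0.9, 0.999), epsilon=1e-08 are the defaults
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=100_000,
    fp16=True,                      # mixed_precision_training: Native AMP
)
```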

Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| 2.7429        | 0.6122  | 1000  | 2.9154          | 0.9860 | 0.8820 |
| 0.1586        | 1.2241  | 2000  | 0.7989          | 0.6341 | 0.1888 |
| 0.0475        | 1.8362  | 3000  | 0.7777          | 0.5725 | 0.1637 |
| 0.0452        | 2.4481  | 4000  | 0.4482          | 0.5083 | 0.1482 |
| 0.0387        | 3.0600  | 5000  | 0.4168          | 0.4770 | 0.1396 |
| 0.0454        | 3.6722  | 6000  | 0.3792          | 0.4501 | 0.1306 |
| 0.0215        | 4.2841  | 7000  | 0.3758          | 0.4564 | 0.1324 |
| 0.0342        | 4.8962  | 8000  | 0.3737          | 0.4557 | 0.1298 |
| 0.0243        | 5.5081  | 9000  | 0.3805          | 0.4325 | 0.1252 |
| 0.0183        | 6.1200  | 10000 | 0.3490          | 0.4257 | 0.1240 |
| 0.0253        | 6.7322  | 11000 | 0.3670          | 0.4185 | 0.1199 |
| 0.0115        | 7.3440  | 12000 | 0.3664          | 0.4125 | 0.1207 |
| 0.0141        | 7.9562  | 13000 | 0.2952          | 0.4021 | 0.1153 |
| 0.0141        | 8.5681  | 14000 | 0.3231          | 0.4031 | 0.1133 |
| 0.0082        | 9.1800  | 15000 | 0.3209          | 0.4000 | 0.1141 |
| 0.0214        | 9.7922  | 16000 | 0.3115          | 0.3985 | 0.1134 |
| 0.0146        | 10.4040 | 17000 | 0.3092          | 0.3743 | 0.1089 |
| 0.0367        | 11.0159 | 18000 | 0.3207          | 0.3914 | 0.1153 |
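
For reference, the Wer and Cer columns can be computed with the `evaluate` library; the strings below are illustrative stand-ins for decoded examples from the eval split.

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Illustrative examples; in practice these come from decoding the eval set.
references = ["moni dziko lapansi"]
predictions = ["moni dziko"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```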

Framework versions

  • Transformers 4.48.1
  • Pytorch 2.6.0+cu124
  • Datasets 3.5.0
  • Tokenizers 0.21.1