# w2v-bert-2.0-igbo_naijavoices_1h
This model is a fine-tuned version of [facebook/w2v-bert-2.0](https://huggingface.co/facebook/w2v-bert-2.0) on the NAIJAVOICES_IGBO_1H dataset (config: NA). It achieves the following results on the evaluation set:
- Loss: 0.8551
- Wer: 0.6458
- Cer: 0.2352
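
For quick inference, the checkpoint can be loaded with the standard Transformers CTC classes. The sketch below assumes the repository ships a `Wav2Vec2BertForCTC` head with a bundled processor (the usual setup for w2v-bert-2.0 ASR fine-tunes, though this card does not confirm it), and `sample.wav` is a placeholder path:

```python
# Minimal transcription sketch; assumes a CTC head and 16 kHz mono audio.
import torch
import torchaudio
from transformers import AutoProcessor, Wav2Vec2BertForCTC

model_id = "CLEAR-Global/w2v-bert-2.0-igbo_naijavoices_1h"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2BertForCTC.from_pretrained(model_id)
model.eval()

# "sample.wav" is a placeholder; resample to 16 kHz if needed.
waveform, sr = torchaudio.load("sample.wav")
if sr != 16_000:
    waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding.
ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(ids)[0])
```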
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 160
- eval_batch_size: 160
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- total_train_batch_size: 320
- total_eval_batch_size: 320
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25000.0
- mixed_precision_training: Native AMP
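
These settings map onto `transformers.TrainingArguments` roughly as below. This is a hedged reconstruction from the list above, not the published training script; the per-device batch sizes assume the 2-GPU layout listed:

```python
# Reconstruction of the run configuration (assumption: the actual script may differ).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-igbo_naijavoices_1h",
    learning_rate=3e-5,
    per_device_train_batch_size=160,  # x2 GPUs -> total train batch size 320
    per_device_eval_batch_size=160,   # x2 GPUs -> total eval batch size 320
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=25000.0,
    fp16=True,  # "Native AMP" in the card; bf16 would be the other common choice
)
```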
### Training results
| Training Loss | Epoch    | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:--------:|:----:|:---------------:|:------:|:------:|
| 6.2706        | 33.3333  | 100  | 6.3346          | 1.8011 | 1.0293 |
| 3.6487        | 66.6667  | 200  | 3.6301          | 0.9877 | 0.8946 |
| 3.0158        | 100.0    | 300  | 3.0263          | 1.0    | 1.0000 |
| 2.3833        | 133.3333 | 400  | 2.2797          | 1.0013 | 0.6634 |
| 1.256         | 166.6667 | 500  | 1.1780          | 0.8874 | 0.3541 |
| 0.8752        | 200.0    | 600  | 0.9456          | 0.7528 | 0.2749 |
| 0.6649        | 233.3333 | 700  | 0.8731          | 0.6848 | 0.2482 |
| 0.52          | 266.6667 | 800  | 0.8531          | 0.6449 | 0.2352 |
| 0.411         | 300.0    | 900  | 0.8922          | 0.6298 | 0.2294 |
| 0.26          | 333.3333 | 1000 | 0.9620          | 0.6131 | 0.2253 |
| 0.1774        | 366.6667 | 1100 | 1.0638          | 0.6135 | 0.2232 |
| 0.1402        | 400.0    | 1200 | 1.1608          | 0.6048 | 0.2210 |
| 0.0775        | 433.3333 | 1300 | 1.2847          | 0.6022 | 0.2184 |
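
Wer and Cer above are word- and character-error rates on the evaluation set. To score your own transcriptions the same way, the `evaluate` library (an assumption; the card does not name the scorer it used) provides both metrics:

```python
# Computing WER/CER with the `evaluate` package (assumed scorer, not confirmed by this card).
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["example transcript"]  # model outputs (placeholder)
references = ["example transcript"]   # ground-truth text (placeholder)

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```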
### Framework versions
- Transformers 4.48.1
- PyTorch 2.7.1+cu126
- Datasets 4.0.0
- Tokenizers 0.21.2