# wav2vec2-ft-datastf5-1lre6-adm-ga2b16-st30k-pat3
This model is a fine-tuned version of [jonatasgrosman/wav2vec2-large-english](https://huggingface.co/jonatasgrosman/wav2vec2-large-english) on an unknown dataset. It achieves the following results on the evaluation set (a sketch of how such metrics can be computed follows the list):
- Loss: 3.0567
- WER: 0.9996
- CER: 0.9865
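For reference, word and character error rates like those reported above can be computed with the Hugging Face `evaluate` library. This is a minimal sketch, not the exact evaluation script used for this model; the prediction and reference strings are placeholders.

```python
import evaluate

# Minimal sketch: compare model transcriptions against reference texts.
# The strings below are placeholders, not data from this model's eval set.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["a placeholder transcription"]
references = ["a reference transcription"]

wer = wer_metric.compute(predictions=predictions, references=references)
cer = cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```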
## Model description
More information needed
## Intended uses & limitations
More information needed
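In the absence of detailed documentation, the sketch below shows one generic way to load a wav2vec2 CTC checkpoint like this one for transcription with the Transformers `pipeline` API. The repository id is taken from the model tree at the end of this card and the audio path is a placeholder; note that the near-1.0 WER/CER reported above suggests transcription quality may be very limited.

```python
import torch
from transformers import pipeline

# Minimal sketch: load the fine-tuned checkpoint and transcribe a local audio file.
# "sample.wav" is a placeholder path, not part of this model's data.
asr = pipeline(
    "automatic-speech-recognition",
    model="HouraMor/wav2vec2-ft-datastf5-1lre6-adm-ga2b16-st30k-pat3",
    device=0 if torch.cuda.is_available() else -1,
)

result = asr("sample.wav")
print(result["text"])
```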
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 1e-06
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1500
- training_steps: 30000
- mixed_precision_training: Native AMP
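As a rough illustration, the hyperparameters above correspond approximately to the following `transformers.TrainingArguments` configuration. Only the documented values are set; everything else (e.g. `output_dir`) is an assumption for the sketch, not the exact training script used here.

```python
from transformers import TrainingArguments

# Sketch of the training configuration implied by the list above.
training_args = TrainingArguments(
    output_dir="wav2vec2-ft-datastf5-1lre6-adm-ga2b16-st30k-pat3",  # assumed
    learning_rate=1e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1500,
    max_steps=30000,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```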
### Training results
Training Loss | Epoch | Step | Validation Loss | WER | CER |
---|---|---|---|---|---|
25.6288 | 0.6200 | 500 | 21.6325 | 1.8808 | 1.4695 |
5.6781 | 1.2393 | 1000 | 3.9369 | 1.0045 | 0.9887 |
3.8734 | 1.8593 | 1500 | 3.3859 | 0.9998 | 0.9861 |
3.407 | 2.4786 | 2000 | 3.3293 | 0.9996 | 0.9864 |
3.4348 | 3.0980 | 2500 | 3.2041 | 0.9996 | 0.9865 |
3.2992 | 3.7179 | 3000 | 3.1785 | 0.9996 | 0.9865 |
3.2306 | 4.3373 | 3500 | 3.1569 | 0.9996 | 0.9865 |
3.151 | 4.9572 | 4000 | 3.1325 | 0.9996 | 0.9865 |
3.1182 | 5.5766 | 4500 | 3.1203 | 0.9996 | 0.9865 |
3.1398 | 6.1959 | 5000 | 3.1219 | 0.9996 | 0.9865 |
3.1191 | 6.8159 | 5500 | 3.1054 | 0.9996 | 0.9865 |
3.1332 | 7.4352 | 6000 | 3.0948 | 0.9996 | 0.9865 |
3.0845 | 8.0546 | 6500 | 3.0861 | 0.9996 | 0.9865 |
3.0783 | 8.6745 | 7000 | 3.0861 | 0.9996 | 0.9865 |
3.0583 | 9.2939 | 7500 | 3.0810 | 0.9996 | 0.9865 |
3.1264 | 9.9138 | 8000 | 3.0866 | 0.9996 | 0.9865 |
3.0891 | 10.5332 | 8500 | 3.0841 | 0.9996 | 0.9865 |
3.0803 | 11.1525 | 9000 | 3.0768 | 0.9996 | 0.9865 |
3.1139 | 11.7725 | 9500 | 3.0600 | 0.9996 | 0.9865 |
3.0704 | 12.3918 | 10000 | 3.0682 | 0.9996 | 0.9865 |
2.9783 | 13.0112 | 10500 | 3.0559 | 0.9996 | 0.9865 |
3.0563 | 13.6311 | 11000 | 3.0641 | 0.9996 | 0.9865 |
3.0726 | 14.2505 | 11500 | 3.0537 | 0.9996 | 0.9865 |
3.0792 | 14.8704 | 12000 | 3.0560 | 0.9996 | 0.9865 |
3.0304 | 15.4898 | 12500 | 3.0579 | 0.9996 | 0.9865 |
3.0168 | 16.1091 | 13000 | 3.0567 | 0.9996 | 0.9865 |
### Framework versions
- Transformers 4.52.3
- PyTorch 2.7.0+cu118
- Datasets 3.6.0
- Tokenizers 0.21.1
## Model tree for HouraMor/wav2vec2-ft-datastf5-1lre6-adm-ga2b16-st30k-pat3

Base model: [jonatasgrosman/wav2vec2-large-english](https://huggingface.co/jonatasgrosman/wav2vec2-large-english)