# whisper-small-karelian-cs-w-rus
This model is a fine-tuned version of openai/whisper-small on an unspecified dataset. It achieves the following results on the evaluation set (a short inference example follows the metrics):
- Loss: 0.7827
- Wer: 0.4393
- Cer: 0.1569
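
As a quick-start sketch (not part of the original card), transcription with this checkpoint can be run through the standard `transformers` automatic-speech-recognition pipeline; `audio.wav` below is a placeholder path for a local audio file.

```python
# Minimal inference sketch, assuming the transformers ASR pipeline.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="Mihaj/whisper-small-karelian-cs-w-rus",
)

result = asr("audio.wav")  # placeholder: path to a local audio file
print(result["text"])
```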
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows this list):
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- distributed_type: multi-GPU
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: AdamW (bitsandbytes 8-bit, `OptimizerNames.ADAMW_BNB`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 10000
- mixed_precision_training: Native AMP
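
The hyperparameters above can be expressed as `transformers.Seq2SeqTrainingArguments`. This is a reconstruction sketch only: the output directory is an assumed name, and the dataset, model, and data-collator setup are omitted.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the listed hyperparameters; "./whisper-small-karelian-cs-w-rus"
# is an assumed output directory, not taken from the original training script.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-karelian-cs-w-rus",
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=8,   # 4 x 8 -> total train batch size 32
    seed=42,
    optim="adamw_bnb_8bit",          # OptimizerNames.ADAMW_BNB
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=10000,
    fp16=True,                       # native AMP mixed precision
)
```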
### Training results
| Training Loss | Epoch  | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:------:|:-----:|:---------------:|:------:|:------:|
| 0.4436        | 0.4813 | 500   | 0.7520          | 0.5127 | 0.1571 |
| 0.2601        | 0.9627 | 1000  | 0.6816          | 0.4746 | 0.1602 |
| 0.2059        | 1.4438 | 1500  | 0.7251          | 0.4717 | 0.1724 |
| 0.1789        | 1.9252 | 2000  | 0.6968          | 0.4551 | 0.1674 |
| 0.1262        | 2.4063 | 2500  | 0.6831          | 0.4475 | 0.1415 |
| 0.0987        | 2.8876 | 3000  | 0.6758          | 0.4227 | 0.1353 |
| 0.0693        | 3.3687 | 3500  | 0.7007          | 0.4426 | 0.1382 |
| 0.059         | 3.8501 | 4000  | 0.6895          | 0.4195 | 0.1354 |
| 0.0429        | 4.3312 | 4500  | 0.6934          | 0.4276 | 0.1565 |
| 0.0411        | 4.8125 | 5000  | 0.7100          | 0.4270 | 0.1336 |
| 0.0414        | 5.2936 | 5500  | 0.7505          | 0.4455 | 0.1430 |
| 0.0424        | 5.7750 | 6000  | 0.7456          | 0.4683 | 0.1723 |
| 0.0407        | 6.2561 | 6500  | 0.7290          | 0.4372 | 0.1452 |
| 0.034         | 6.7374 | 7000  | 0.7417          | 0.4290 | 0.1361 |
| 0.0334        | 7.2185 | 7500  | 0.7427          | 0.4310 | 0.1410 |
| 0.028         | 7.6999 | 8000  | 0.7573          | 0.4149 | 0.1343 |
| 0.0216        | 8.1810 | 8500  | 0.7471          | 0.4185 | 0.1368 |
| 0.0197        | 8.6623 | 9000  | 0.7490          | 0.4220 | 0.1383 |
| 0.0244        | 9.1434 | 9500  | 0.7671          | 0.4301 | 0.1423 |
| 0.0219        | 9.6248 | 10000 | 0.7827          | 0.4393 | 0.1569 |
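
The Wer and Cer columns are word- and character-error rates. As a minimal sketch (not from the original evaluation script), scores of this kind can be computed with the `evaluate` library, which uses the jiwer backend for both metrics; the prediction and reference strings below are placeholders.

```python
import evaluate

# Placeholder hypothesis/reference pairs; in practice these would be the model
# transcriptions and the ground-truth transcripts of the evaluation set.
predictions = ["example transcription"]
references = ["example reference transcription"]

wer = evaluate.load("wer").compute(predictions=predictions, references=references)
cer = evaluate.load("cer").compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```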
### Framework versions
- Transformers 4.51.3
- Pytorch 2.7.0+cu126
- Datasets 3.5.1
- Tokenizers 0.21.1