# ModelEcho
This model is a fine-tuned version of gigant/whisper-medium-romanian on the arrow dataset. It achieves the following results on the evaluation set:
- Loss: 0.1863
- Wer: 10.5222
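
To run inference with the adapter, load the base checkpoint and apply the PEFT weights on top. The sketch below is untested and makes assumptions: the adapter repo id is a placeholder (substitute this model's actual Hub id), and the input is 16 kHz mono audio loaded with `soundfile`.

```python
import soundfile as sf
import torch
from peft import PeftModel
from transformers import WhisperForConditionalGeneration, WhisperProcessor

BASE = "gigant/whisper-medium-romanian"
ADAPTER = "your-username/ModelEcho"  # placeholder: replace with this repo's Hub id

processor = WhisperProcessor.from_pretrained(BASE)
model = PeftModel.from_pretrained(
    WhisperForConditionalGeneration.from_pretrained(BASE), ADAPTER
)
model.eval()

audio, sr = sf.read("sample.wav")  # Whisper expects 16 kHz mono input
inputs = processor(audio, sampling_rate=sr, return_tensors="pt")

with torch.no_grad():
    ids = model.generate(
        input_features=inputs.input_features, language="ro", task="transcribe"
    )
print(processor.batch_decode(ids, skip_special_tokens=True)[0])
```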
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 5000
- mixed_precision_training: Native AMP
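
Expressed as `transformers.Seq2SeqTrainingArguments`, the list above maps roughly to the following. This is a reconstruction from the reported values, not the exact training script; `output_dir` is a placeholder.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./modelecho",        # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # effective train batch size: 16 * 4 = 64
    seed=42,
    optim="adamw_torch",             # AdamW defaults: betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=5000,
    fp16=True,                       # native AMP mixed-precision training
)
```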
### Training results
| Training Loss | Epoch  | Step | Validation Loss | Wer     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 0.3049        | 0.2453 | 500  | 0.2264          | 12.3599 |
| 0.2764        | 0.4907 | 1000 | 0.2071          | 11.3205 |
| 0.2792        | 0.7360 | 1500 | 0.1992          | 11.0228 |
| 0.2504        | 0.9814 | 2000 | 0.1945          | 10.7824 |
| 0.2424        | 1.2267 | 2500 | 0.1918          | 10.6759 |
| 0.249         | 1.4720 | 3000 | 0.1896          | 10.6031 |
| 0.244         | 1.7174 | 3500 | 0.1880          | 10.5701 |
| 0.2304        | 1.9627 | 4000 | 0.1868          | 10.4709 |
| 0.2326        | 2.2080 | 4500 | 0.1866          | 10.5371 |
| 0.2204        | 2.4534 | 5000 | 0.1863          | 10.5222 |
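
WER in the table is reported as a percentage. A minimal sketch of how such a score can be computed with the `evaluate` library (the example strings are illustrative, not from the evaluation set):

```python
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["o propoziție transcrisă de model"]  # illustrative model outputs
references = ["o propoziție de referință"]          # illustrative ground truth

# evaluate returns a fraction; multiply by 100 to match the table's scale
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```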
### Framework versions
- PEFT 0.16.0
- Transformers 4.53.2
- PyTorch 2.7.1+cu118
- Datasets 4.0.0
- Tokenizers 0.21.2