---
library_name: transformers
language:
- he
license: mit
base_model: openai/whisper-large-v3-turbo
tags:
- hf-asr-leaderboard
- generated_from_trainer
metrics:
- wer
model-index:
- name: he-cantillation
results: []
---
# he-cantillation
This model is a fine-tuned version of [openai/whisper-large-v3-turbo](https://huggingface.co/openai/whisper-large-v3-turbo) on an unknown dataset.
It achieves the following results on the evaluation set (a short note on the WER computation follows the list):
- Loss: 5.0610
- Wer: 96.8708
- Avg Precision Exact: 0.0606
- Avg Recall Exact: 0.1083
- Avg F1 Exact: 0.0750
- Avg Precision Letter Shift: 0.0800
- Avg Recall Letter Shift: 0.1442
- Avg F1 Letter Shift: 0.0985
- Avg Precision Word Level: 0.0945
- Avg Recall Word Level: 0.1679
- Avg F1 Word Level: 0.1153
- Avg Precision Word Shift: 0.1901
- Avg Recall Word Shift: 0.3552
- Avg F1 Word Shift: 0.2367
- Precision Median Exact: 0.0294
- Recall Median Exact: 0.0667
- F1 Median Exact: 0.04
- Precision Max Exact: 1.0
- Recall Max Exact: 1.0
- F1 Max Exact: 1.0
- Precision Min Exact: 0.0
- Recall Min Exact: 0.0
- F1 Min Exact: 0.0
- Precision Min Letter Shift: 0.0
- Recall Min Letter Shift: 0.0
- F1 Min Letter Shift: 0.0
- Precision Min Word Level: 0.0
- Recall Min Word Level: 0.0
- F1 Min Word Level: 0.0
- Precision Min Word Shift: 0.0
- Recall Min Word Shift: 0.0
- F1 Min Word Shift: 0.0
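WER is the headline metric here, reported as a percentage. The exact/letter-shift/word-level/word-shift precision, recall, and F1 families come from a custom evaluation script that is not documented in this card, so their precise definitions are not reproduced. For reference, a minimal sketch of a standard WER computation with the `evaluate` library; the example strings are illustrative placeholders, not outputs of this model:

```python
# Minimal WER sketch using the Hugging Face `evaluate` library.
# The strings below are illustrative placeholders, not model outputs.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["bereshit bara elohim"]     # hypothetical model output
references = ["bereshit bara elohim et"]   # hypothetical ground truth

# WER = (substitutions + deletions + insertions) / reference word count,
# scaled to a percentage to match the table in this card.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")  # 25.00 for this toy pair (one missing word out of four)
```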
## Model description
This model is a fine-tune of [openai/whisper-large-v3-turbo](https://huggingface.co/openai/whisper-large-v3-turbo) for Hebrew (`he`) automatic speech recognition. Judging by the repository name, it targets transcription of Hebrew Biblical cantillation (chanted Scripture reading); the training task and data have not otherwise been documented.
## Intended uses & limitations
Intended uses have not been documented. Note that this checkpoint's evaluation WER is roughly 96.9%, meaning it does not yet produce usable transcriptions on the evaluation set; treat its outputs as experimental (see the inference sketch below).
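A minimal inference sketch using the `transformers` ASR pipeline. The repo id `cantillation/he-cantillation` is an assumption inferred from this card's name; substitute the actual Hub id or a local checkpoint path:

```python
# Minimal inference sketch with the transformers ASR pipeline.
# NOTE: the repo id below is an assumption inferred from this card;
# replace it with the actual Hub id or a local checkpoint directory.
import torch
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="cantillation/he-cantillation",  # assumed repo id
    torch_dtype=torch.float16,
    device="cuda:0" if torch.cuda.is_available() else "cpu",
)

# Whisper operates on 30-second windows; chunking handles longer recordings.
result = asr(
    "recording.wav",  # path to an audio file
    chunk_length_s=30,
    generate_kwargs={"language": "hebrew", "task": "transcribe"},
)
print(result["text"])
```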
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- training_steps: 60000
- mixed_precision_training: Native AMP
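As referenced above, a sketch of how these hyperparameters map onto `Seq2SeqTrainingArguments`. Only the values listed in this section come from the card; the output directory and evaluation cadence are illustrative (the cadence is inferred from the results table below):

```python
# Reconstruction sketch: the hyperparameters above expressed as
# transformers.Seq2SeqTrainingArguments. Only the listed values are from
# the card; output_dir and eval cadence are illustrative assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./he-cantillation",   # illustrative
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",              # AdamW; betas=(0.9, 0.999), eps=1e-8 are the defaults
    lr_scheduler_type="linear",
    warmup_steps=1000,
    max_steps=60000,
    fp16=True,                        # "Native AMP" mixed precision
    eval_strategy="steps",
    eval_steps=2500,                  # inferred from the results table below
    predict_with_generate=True,       # illustrative; typical for Whisper fine-tuning
)
```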
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Avg Precision Exact | Avg Recall Exact | Avg F1 Exact | Avg Precision Letter Shift | Avg Recall Letter Shift | Avg F1 Letter Shift | Avg Precision Word Level | Avg Recall Word Level | Avg F1 Word Level | Avg Precision Word Shift | Avg Recall Word Shift | Avg F1 Word Shift | Precision Median Exact | Recall Median Exact | F1 Median Exact | Precision Max Exact | Recall Max Exact | F1 Max Exact | Precision Min Exact | Recall Min Exact | F1 Min Exact | Precision Min Letter Shift | Recall Min Letter Shift | F1 Min Letter Shift | Precision Min Word Level | Recall Min Word Level | F1 Min Word Level | Precision Min Word Shift | Recall Min Word Shift | F1 Min Word Shift |
|:-------------:|:------:|:-----:|:---------------:|:--------:|:-------------------:|:----------------:|:------------:|:--------------------------:|:-----------------------:|:-------------------:|:------------------------:|:---------------------:|:-----------------:|:------------------------:|:---------------------:|:-----------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:------------:|:-------------------:|:----------------:|:------------:|:--------------------------:|:-----------------------:|:-------------------:|:------------------------:|:---------------------:|:-----------------:|:------------------------:|:---------------------:|:-----------------:|
| No log | 0.0002 | 1 | 6.2380 | 110.8685 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 0.0645 | 0.3754 | 2500 | 2.4574 | 95.0107 | 0.0821 | 0.0867 | 0.0838 | 0.1114 | 0.1192 | 0.1142 | 0.1349 | 0.1431 | 0.1376 | 0.2944 | 0.3265 | 0.3067 | 0.0449 | 0.0526 | 0.0476 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0342 | 0.7508 | 5000 | 2.7722 | 95.1289 | 0.0845 | 0.0930 | 0.0876 | 0.1130 | 0.1269 | 0.1180 | 0.1355 | 0.1519 | 0.1411 | 0.2829 | 0.3349 | 0.3018 | 0.04 | 0.0533 | 0.0460 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0227 | 1.1261 | 7500 | 3.0372 | 96.0378 | 0.0695 | 0.0938 | 0.0777 | 0.0945 | 0.1296 | 0.1060 | 0.1145 | 0.1560 | 0.1277 | 0.2386 | 0.3426 | 0.2719 | 0.0333 | 0.0556 | 0.04 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0134 | 1.5015 | 10000 | 3.3407 | 97.1990 | 0.0528 | 0.0844 | 0.0625 | 0.0705 | 0.1139 | 0.0835 | 0.0852 | 0.1368 | 0.1005 | 0.1785 | 0.2988 | 0.2142 | 0.0270 | 0.0526 | 0.0357 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0164 | 1.8769 | 12500 | 3.4023 | 97.1348 | 0.0502 | 0.0782 | 0.0587 | 0.0708 | 0.1122 | 0.0829 | 0.0883 | 0.1383 | 0.1023 | 0.1950 | 0.3205 | 0.2307 | 0.0278 | 0.05 | 0.0357 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0099 | 2.2523 | 15000 | 3.5449 | 96.5002 | 0.0636 | 0.1103 | 0.0774 | 0.0818 | 0.1446 | 0.1002 | 0.0959 | 0.1688 | 0.1172 | 0.1927 | 0.3540 | 0.2395 | 0.0303 | 0.0667 | 0.0408 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0087 | 2.6276 | 17500 | 3.6473 | 96.9131 | 0.0629 | 0.1068 | 0.0760 | 0.0830 | 0.1421 | 0.1000 | 0.0969 | 0.1655 | 0.1164 | 0.1956 | 0.3517 | 0.2395 | 0.0303 | 0.0625 | 0.04 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0192 | 3.0030 | 20000 | 4.0472 | 97.3799 | 0.0532 | 0.0979 | 0.0664 | 0.0724 | 0.1334 | 0.0898 | 0.0878 | 0.1609 | 0.1084 | 0.1819 | 0.3434 | 0.2263 | 0.0286 | 0.0625 | 0.0392 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0075 | 3.3784 | 22500 | 4.1235 | 97.3449 | 0.0485 | 0.0950 | 0.0614 | 0.0671 | 0.1345 | 0.0852 | 0.0809 | 0.1608 | 0.1024 | 0.1666 | 0.3439 | 0.2129 | 0.025 | 0.0625 | 0.0364 | 1.0 | 1.0 | 0.9091 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0049 | 3.7538 | 25000 | 4.0555 | 97.9795 | 0.0442 | 0.0939 | 0.0568 | 0.0616 | 0.1317 | 0.0787 | 0.0754 | 0.1577 | 0.0946 | 0.1565 | 0.3412 | 0.2004 | 0.0227 | 0.0588 | 0.0351 | 1.0 | 1.0 | 0.8571 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0022 | 4.1291 | 27500 | 4.0404 | 97.6337 | 0.0503 | 0.0884 | 0.0615 | 0.0700 | 0.1253 | 0.0861 | 0.0847 | 0.1503 | 0.1037 | 0.1824 | 0.3361 | 0.2259 | 0.0238 | 0.0556 | 0.0345 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0016 | 4.5045 | 30000 | 4.4676 | 97.8424 | 0.0495 | 0.0885 | 0.0608 | 0.0685 | 0.1239 | 0.0839 | 0.0807 | 0.1467 | 0.0991 | 0.1739 | 0.3305 | 0.2164 | 0.025 | 0.0588 | 0.0364 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0031 | 4.8799 | 32500 | 3.8982 | 97.0867 | 0.0638 | 0.1024 | 0.0758 | 0.0841 | 0.1362 | 0.0998 | 0.0999 | 0.1615 | 0.1181 | 0.2033 | 0.3429 | 0.2441 | 0.0294 | 0.0588 | 0.0392 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0037 | 5.2553 | 35000 | 5.1029 | 97.8978 | 0.0483 | 0.0917 | 0.0605 | 0.0670 | 0.1264 | 0.0828 | 0.0815 | 0.1520 | 0.0997 | 0.1718 | 0.3367 | 0.2144 | 0.0256 | 0.0625 | 0.0370 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0018 | 5.6306 | 37500 | 4.3351 | 97.4514 | 0.0476 | 0.0950 | 0.0607 | 0.0646 | 0.1297 | 0.0820 | 0.0775 | 0.1547 | 0.0979 | 0.1635 | 0.3385 | 0.2094 | 0.025 | 0.0625 | 0.0370 | 1.0 | 1.0 | 0.8571 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0011 | 6.0060 | 40000 | 4.4936 | 97.1275 | 0.0539 | 0.1114 | 0.0696 | 0.0727 | 0.1496 | 0.0929 | 0.0866 | 0.1756 | 0.1098 | 0.1715 | 0.3579 | 0.2189 | 0.0267 | 0.0714 | 0.0385 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0005 | 6.3814 | 42500 | 4.5198 | 96.9962 | 0.0602 | 0.1078 | 0.0743 | 0.0779 | 0.1423 | 0.0969 | 0.0929 | 0.1661 | 0.1138 | 0.1893 | 0.3525 | 0.2340 | 0.0294 | 0.0667 | 0.0408 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.001 | 6.7568 | 45000 | 4.8643 | 97.2647 | 0.0509 | 0.0968 | 0.0640 | 0.0686 | 0.1322 | 0.0862 | 0.0823 | 0.1559 | 0.1023 | 0.1718 | 0.3380 | 0.2151 | 0.0256 | 0.0625 | 0.0364 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0003 | 7.1321 | 47500 | 4.7786 | 97.2238 | 0.0546 | 0.1022 | 0.0680 | 0.0735 | 0.1403 | 0.0918 | 0.0869 | 0.1647 | 0.1081 | 0.1774 | 0.3502 | 0.2227 | 0.0263 | 0.0625 | 0.0377 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0013 | 7.5075 | 50000 | 4.5187 | 96.5775 | 0.0600 | 0.1019 | 0.0726 | 0.0785 | 0.1357 | 0.0954 | 0.0934 | 0.1601 | 0.1125 | 0.1956 | 0.3501 | 0.2394 | 0.0286 | 0.0625 | 0.0392 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0005 | 7.8829 | 52500 | 4.7046 | 97.0517 | 0.0573 | 0.0999 | 0.0703 | 0.0759 | 0.1339 | 0.0930 | 0.0906 | 0.1595 | 0.1109 | 0.1874 | 0.3438 | 0.2321 | 0.0286 | 0.0625 | 0.0385 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 8.2583 | 55000 | 5.3282 | 97.2296 | 0.0536 | 0.1044 | 0.0676 | 0.0716 | 0.1416 | 0.0901 | 0.0847 | 0.1667 | 0.1063 | 0.1682 | 0.3478 | 0.2142 | 0.025 | 0.0667 | 0.0370 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 8.6336 | 57500 | 5.3362 | 96.9743 | 0.0589 | 0.1081 | 0.0734 | 0.0757 | 0.1421 | 0.0950 | 0.0887 | 0.1642 | 0.1103 | 0.1800 | 0.3481 | 0.2266 | 0.0286 | 0.0667 | 0.0408 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 9.0090 | 60000 | 5.0610 | 96.8708 | 0.0606 | 0.1083 | 0.0750 | 0.0800 | 0.1442 | 0.0985 | 0.0945 | 0.1679 | 0.1153 | 0.1901 | 0.3552 | 0.2367 | 0.0294 | 0.0667 | 0.04 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.7.0+cu126
- Datasets 2.12.0
- Tokenizers 0.20.1