---
library_name: transformers
language:
- he
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- hf-asr-leaderboard
- generated_from_trainer
metrics:
- wer
model-index:
- name: he-cantillation
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# he-cantillation

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 4.7623
- Wer: 98.8781
- Avg Precision Exact: 0.0291
- Avg Recall Exact: 0.0505
- Avg F1 Exact: 0.0356
- Avg Precision Letter Shift: 0.0493
- Avg Recall Letter Shift: 0.0876
- Avg F1 Letter Shift: 0.0605
- Avg Precision Word Level: 0.0685
- Avg Recall Word Level: 0.1200
- Avg F1 Word Level: 0.0833
- Avg Precision Word Shift: 0.1707
- Avg Recall Word Shift: 0.3059
- Avg F1 Word Shift: 0.2088
- Precision Median Exact: 0.0
- Recall Median Exact: 0.0
- F1 Median Exact: 0.0
- Precision Max Exact: 0.4444
- Recall Max Exact: 1.0
- F1 Max Exact: 0.4286
- Precision Min Exact: 0.0
- Recall Min Exact: 0.0
- F1 Min Exact: 0.0
- Precision Min Letter Shift: 0.0
- Recall Min Letter Shift: 0.0
- F1 Min Letter Shift: 0.0
- Precision Min Word Level: 0.0
- Recall Min Word Level: 0.0
- F1 Min Word Level: 0.0
- Precision Min Word Shift: 0.0
- Recall Min Word Shift: 0.0
- F1 Min Word Shift: 0.0

## Model description

More information needed

## Intended uses & limitations

More information needed
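
Note that the reported evaluation WER is close to 99%, so transcription quality on general Hebrew speech should not be expected without further work. For completeness, below is a minimal usage sketch, assuming the checkpoint is published on the Hub under an id like `<user>/he-cantillation` (the repo id is not stated in this card; replace it with the actual one):

```python
from transformers import pipeline

# Hypothetical repo id; substitute the actual Hub path of this checkpoint.
asr = pipeline(
    "automatic-speech-recognition",
    model="<user>/he-cantillation",
)

# Transcribe a Hebrew audio file; the pipeline resamples the input to 16 kHz.
result = asr(
    "example.wav",
    generate_kwargs={"language": "hebrew", "task": "transcribe"},
)
print(result["text"])
```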

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- training_steps: 100000
- mixed_precision_training: Native AMP
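
As a reference only, a sketch of how these hyperparameters map onto `Seq2SeqTrainingArguments` in a standard Trainer-based Whisper fine-tuning setup (the actual training script is not included in this card; `output_dir` is an assumption):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./he-cantillation",   # assumed output path
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",              # AdamW, betas=(0.9, 0.999), eps=1e-8 (defaults)
    lr_scheduler_type="linear",
    warmup_steps=1000,
    max_steps=100_000,
    fp16=True,                        # "Native AMP" mixed precision
)
```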

### Training results

| Training Loss | Epoch   | Step   | Validation Loss | Wer     | Avg Precision Exact | Avg Recall Exact | Avg F1 Exact | Avg Precision Letter Shift | Avg Recall Letter Shift | Avg F1 Letter Shift | Avg Precision Word Level | Avg Recall Word Level | Avg F1 Word Level | Avg Precision Word Shift | Avg Recall Word Shift | Avg F1 Word Shift | Precision Median Exact | Recall Median Exact | F1 Median Exact | Precision Max Exact | Recall Max Exact | F1 Max Exact | Precision Min Exact | Recall Min Exact | F1 Min Exact | Precision Min Letter Shift | Recall Min Letter Shift | F1 Min Letter Shift | Precision Min Word Level | Recall Min Word Level | F1 Min Word Level | Precision Min Word Shift | Recall Min Word Shift | F1 Min Word Shift |
|:-------------:|:-------:|:------:|:---------------:|:-------:|:-------------------:|:----------------:|:------------:|:--------------------------:|:-----------------------:|:-------------------:|:------------------------:|:---------------------:|:-----------------:|:------------------------:|:---------------------:|:-----------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:------------:|:-------------------:|:----------------:|:------------:|:--------------------------:|:-----------------------:|:-------------------:|:------------------------:|:---------------------:|:-----------------:|:------------------------:|:---------------------:|:-----------------:|
| 0.0276        | 3.0030  | 20000  | 3.0853          | 98.7760 | 0.0336              | 0.0472           | 0.0380       | 0.0581                     | 0.0831                  | 0.0658              | 0.0801                   | 0.1144                | 0.0904            | 0.2032                   | 0.3029                | 0.2334            | 0.0213                 | 0.0303              | 0.0267          | 0.6667              | 0.6667           | 0.6667       | 0.0                 | 0.0              | 0.0          | 0.0                        | 0.0                     | 0.0                 | 0.0                      | 0.0                   | 0.0               | 0.0                      | 0.0                   | 0.0               |
| 0.0026        | 6.0060  | 40000  | 3.8257          | 98.8227 | 0.0303              | 0.0497           | 0.0364       | 0.0528                     | 0.0880                  | 0.0637              | 0.0728                   | 0.1208                | 0.0877            | 0.1812                   | 0.3095                | 0.2204            | 0.0                    | 0.0                 | 0.0             | 0.75                | 0.75             | 0.75         | 0.0                 | 0.0              | 0.0          | 0.0                        | 0.0                     | 0.0                 | 0.0                      | 0.0                   | 0.0               | 0.0                      | 0.0                   | 0.0               |
| 0.0004        | 9.0090  | 60000  | 4.3074          | 99.0124 | 0.0283              | 0.0504           | 0.0347       | 0.0476                     | 0.0872                  | 0.0590              | 0.0641                   | 0.1169                | 0.0792            | 0.1600                   | 0.2986                | 0.1987            | 0.0                    | 0.0                 | 0.0             | 1.0                 | 1.0              | 0.8          | 0.0                 | 0.0              | 0.0          | 0.0                        | 0.0                     | 0.0                 | 0.0                      | 0.0                   | 0.0               | 0.0                      | 0.0                   | 0.0               |
| 0.0001        | 12.0120 | 80000  | 4.6534          | 98.8315 | 0.0295              | 0.0516           | 0.0361       | 0.0502                     | 0.0882                  | 0.0611              | 0.0687                   | 0.1192                | 0.0826            | 0.1715                   | 0.3112                | 0.2103            | 0.0                    | 0.0                 | 0.0             | 1.0                 | 1.0              | 1.0          | 0.0                 | 0.0              | 0.0          | 0.0                        | 0.0                     | 0.0                 | 0.0                      | 0.0                   | 0.0               | 0.0                      | 0.0                   | 0.0               |
| 0.0           | 15.0150 | 100000 | 4.7623          | 98.8781 | 0.0291              | 0.0505           | 0.0356       | 0.0493                     | 0.0876                  | 0.0605              | 0.0685                   | 0.1200                | 0.0833            | 0.1707                   | 0.3059                | 0.2088            | 0.0                    | 0.0                 | 0.0             | 0.4444              | 1.0              | 0.4286       | 0.0                 | 0.0              | 0.0          | 0.0                        | 0.0                     | 0.0                 | 0.0                      | 0.0                   | 0.0               | 0.0                      | 0.0                   | 0.0               |
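
The WER values above are on a 0-100 scale. A minimal sketch of computing the same metric with the `evaluate` library (the strings below are placeholders, not data from the evaluation set):

```python
import evaluate

wer_metric = evaluate.load("wer")

# Placeholders; in practice these are the decoded model outputs and the
# reference transcripts from the evaluation set.
predictions = ["shalom olam"]
references = ["shalom lekulam"]

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")
```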


### Framework versions

- Transformers 4.49.0
- Pytorch 2.6.0+cu126
- Datasets 2.12.0
- Tokenizers 0.20.1