cantillation committed on
Commit 4c63924 · verified · 1 Parent(s): 546f5b7

End of training

README.md CHANGED
@@ -20,20 +20,20 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.1692
- - Wer: 21.1641
- - Avg Precision Exact: 0.8062
- - Avg Recall Exact: 0.8080
- - Avg F1 Exact: 0.8065
- - Avg Precision Letter Shift: 0.8352
- - Avg Recall Letter Shift: 0.8370
- - Avg F1 Letter Shift: 0.8355
- - Avg Precision Word Level: 0.8409
- - Avg Recall Word Level: 0.8430
- - Avg F1 Word Level: 0.8413
- - Avg Precision Word Shift: 0.9407
- - Avg Recall Word Shift: 0.9446
- - Avg F1 Word Shift: 0.9419
+ - Loss: 0.1756
+ - Wer: 20.1811
+ - Avg Precision Exact: 0.8083
+ - Avg Recall Exact: 0.8102
+ - Avg F1 Exact: 0.8087
+ - Avg Precision Letter Shift: 0.8373
+ - Avg Recall Letter Shift: 0.8394
+ - Avg F1 Letter Shift: 0.8377
+ - Avg Precision Word Level: 0.8427
+ - Avg Recall Word Level: 0.8450
+ - Avg F1 Word Level: 0.8432
+ - Avg Precision Word Shift: 0.9448
+ - Avg Recall Word Shift: 0.9489
+ - Avg F1 Word Shift: 0.9460
  - Precision Median Exact: 0.9091
  - Recall Median Exact: 0.9091
  - F1 Median Exact: 0.9091
@@ -77,28 +77,33 @@ The following hyperparameters were used during training:
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_steps: 100
- - training_steps: 30000
+ - training_steps: 40000
  - mixed_precision_training: Native AMP
 
  ### Training results
 
  | Training Loss | Epoch | Step | Validation Loss | Wer | Avg Precision Exact | Avg Recall Exact | Avg F1 Exact | Avg Precision Letter Shift | Avg Recall Letter Shift | Avg F1 Letter Shift | Avg Precision Word Level | Avg Recall Word Level | Avg F1 Word Level | Avg Precision Word Shift | Avg Recall Word Shift | Avg F1 Word Shift | Precision Median Exact | Recall Median Exact | F1 Median Exact | Precision Max Exact | Recall Max Exact | F1 Max Exact | Precision Min Exact | Recall Min Exact | F1 Min Exact | Precision Min Letter Shift | Recall Min Letter Shift | F1 Min Letter Shift | Precision Min Word Level | Recall Min Word Level | F1 Min Word Level | Precision Min Word Shift | Recall Min Word Shift | F1 Min Word Shift |
  |:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------------------:|:----------------:|:------------:|:--------------------------:|:-----------------------:|:-------------------:|:------------------------:|:---------------------:|:-----------------:|:------------------------:|:---------------------:|:-----------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:------------:|:-------------------:|:----------------:|:------------:|:--------------------------:|:-----------------------:|:-------------------:|:------------------------:|:---------------------:|:-----------------:|:------------------------:|:---------------------:|:-----------------:|
- | 0.2936 | 0.16 | 2000 | 0.3109 | 44.8004 | 0.5576 | 0.5646 | 0.5602 | 0.6021 | 0.6096 | 0.6049 | 0.6142 | 0.6239 | 0.6181 | 0.8035 | 0.8191 | 0.8099 | 0.6154 | 0.625 | 0.6207 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 0.1696 | 0.32 | 4000 | 0.2363 | 35.3622 | 0.6570 | 0.6509 | 0.6531 | 0.6965 | 0.6898 | 0.6923 | 0.7059 | 0.7000 | 0.7021 | 0.8774 | 0.8761 | 0.8755 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 0.1147 | 0.48 | 6000 | 0.2059 | 30.2846 | 0.7069 | 0.7015 | 0.7035 | 0.7422 | 0.7364 | 0.7385 | 0.7512 | 0.7461 | 0.7478 | 0.8992 | 0.8989 | 0.8980 | 0.8182 | 0.8 | 0.8000 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
- | 0.101 | 0.64 | 8000 | 0.1887 | 27.1175 | 0.7477 | 0.7487 | 0.7475 | 0.7812 | 0.7822 | 0.7810 | 0.7884 | 0.7898 | 0.7884 | 0.9171 | 0.9216 | 0.9183 | 0.8462 | 0.8571 | 0.8571 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 0.0625 | 0.8 | 10000 | 0.1856 | 25.9313 | 0.7457 | 0.7570 | 0.7506 | 0.7776 | 0.7894 | 0.7827 | 0.7838 | 0.7952 | 0.7888 | 0.9171 | 0.9287 | 0.9219 | 0.8571 | 0.875 | 0.8571 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
- | 0.0744 | 0.96 | 12000 | 0.1771 | 24.8226 | 0.7654 | 0.7722 | 0.7681 | 0.7948 | 0.8019 | 0.7976 | 0.8016 | 0.8085 | 0.8043 | 0.9189 | 0.9265 | 0.9218 | 0.8667 | 0.875 | 0.8696 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0769 | 0.0909 | 0.0833 |
- | 0.0596 | 1.12 | 14000 | 0.1725 | 23.7103 | 0.7773 | 0.7794 | 0.7777 | 0.8077 | 0.8100 | 0.8082 | 0.8148 | 0.8169 | 0.8151 | 0.9306 | 0.9334 | 0.9311 | 0.8889 | 0.8889 | 0.8889 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
- | 0.0515 | 1.28 | 16000 | 0.1696 | 22.9305 | 0.7880 | 0.7897 | 0.7883 | 0.8183 | 0.8200 | 0.8185 | 0.8242 | 0.8261 | 0.8245 | 0.9352 | 0.9384 | 0.9360 | 0.9 | 0.9 | 0.8966 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.1 | 0.1176 |
- | 0.0369 | 1.44 | 18000 | 0.1695 | 22.4390 | 0.7937 | 0.7924 | 0.7925 | 0.8239 | 0.8226 | 0.8226 | 0.8297 | 0.8289 | 0.8286 | 0.9370 | 0.9391 | 0.9372 | 0.9 | 0.9 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 0.035 | 1.6 | 20000 | 0.1699 | 22.2358 | 0.7934 | 0.7948 | 0.7935 | 0.8227 | 0.8242 | 0.8228 | 0.8288 | 0.8303 | 0.8289 | 0.9380 | 0.9412 | 0.9388 | 0.9 | 0.9 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 0.0282 | 1.76 | 22000 | 0.1686 | 21.9549 | 0.7956 | 0.7940 | 0.7942 | 0.8258 | 0.8241 | 0.8243 | 0.8314 | 0.8298 | 0.8300 | 0.9408 | 0.9412 | 0.9402 | 0.9 | 0.9 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.1 | 0.1176 |
- | 0.0215 | 1.92 | 24000 | 0.1688 | 21.6445 | 0.8002 | 0.8022 | 0.8006 | 0.8287 | 0.8307 | 0.8291 | 0.8341 | 0.8363 | 0.8346 | 0.9380 | 0.9417 | 0.9391 | 0.9 | 0.9091 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
- | 0.024 | 2.08 | 26000 | 0.1699 | 21.1899 | 0.8037 | 0.8070 | 0.8047 | 0.8331 | 0.8365 | 0.8342 | 0.8389 | 0.8424 | 0.8400 | 0.9415 | 0.9468 | 0.9434 | 0.9091 | 0.9091 | 0.9032 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
- | 0.0198 | 2.24 | 28000 | 0.1696 | 21.3119 | 0.8038 | 0.8070 | 0.8048 | 0.8327 | 0.8360 | 0.8337 | 0.8382 | 0.8418 | 0.8393 | 0.9390 | 0.9445 | 0.9409 | 0.9091 | 0.9091 | 0.9032 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
- | 0.0219 | 2.4 | 30000 | 0.1692 | 21.1641 | 0.8062 | 0.8080 | 0.8065 | 0.8352 | 0.8370 | 0.8355 | 0.8409 | 0.8430 | 0.8413 | 0.9407 | 0.9446 | 0.9419 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
+ | 0.2873 | 0.16 | 2000 | 0.3088 | 44.3311 | 0.5598 | 0.5685 | 0.5633 | 0.6026 | 0.6118 | 0.6062 | 0.6138 | 0.6239 | 0.6178 | 0.8019 | 0.8196 | 0.8092 | 0.6154 | 0.625 | 0.6207 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 0.1752 | 0.32 | 4000 | 0.2328 | 35.1811 | 0.6557 | 0.6595 | 0.6568 | 0.6946 | 0.6985 | 0.6957 | 0.7041 | 0.7082 | 0.7053 | 0.8676 | 0.8745 | 0.8698 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 0.117 | 0.48 | 6000 | 0.1997 | 29.4605 | 0.7124 | 0.7125 | 0.7117 | 0.7514 | 0.7513 | 0.7506 | 0.7604 | 0.7606 | 0.7597 | 0.9031 | 0.9063 | 0.9037 | 0.8182 | 0.8182 | 0.8148 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.1 | 0.1053 |
+ | 0.0994 | 0.64 | 8000 | 0.1881 | 27.5610 | 0.7359 | 0.7407 | 0.7376 | 0.7708 | 0.7758 | 0.7726 | 0.7783 | 0.7837 | 0.7803 | 0.9117 | 0.9191 | 0.9144 | 0.8333 | 0.8462 | 0.8387 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.0909 | 0.1111 |
+ | 0.0664 | 0.8 | 10000 | 0.1837 | 25.9682 | 0.7446 | 0.7529 | 0.7480 | 0.7785 | 0.7873 | 0.7821 | 0.7857 | 0.7944 | 0.7893 | 0.9194 | 0.9277 | 0.9226 | 0.8462 | 0.8571 | 0.8571 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.0909 | 0.1111 |
+ | 0.0767 | 0.96 | 12000 | 0.1760 | 24.6157 | 0.7561 | 0.7662 | 0.7604 | 0.7868 | 0.7973 | 0.7913 | 0.7936 | 0.8040 | 0.7980 | 0.9194 | 0.9315 | 0.9245 | 0.8667 | 0.875 | 0.8723 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 0.0593 | 1.12 | 14000 | 0.1716 | 23.6511 | 0.7669 | 0.7732 | 0.7694 | 0.7988 | 0.8054 | 0.8014 | 0.8047 | 0.8114 | 0.8073 | 0.9275 | 0.9343 | 0.9300 | 0.875 | 0.8889 | 0.8800 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.1111 | 0.1111 |
+ | 0.0525 | 1.28 | 16000 | 0.1712 | 23.1264 | 0.7778 | 0.7788 | 0.7777 | 0.8092 | 0.8104 | 0.8092 | 0.8156 | 0.8169 | 0.8156 | 0.9345 | 0.9383 | 0.9356 | 0.8889 | 0.8889 | 0.8889 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.1111 | 0.125 |
+ | 0.0358 | 1.44 | 18000 | 0.1699 | 22.4538 | 0.7841 | 0.7845 | 0.7837 | 0.8150 | 0.8157 | 0.8147 | 0.8212 | 0.8222 | 0.8211 | 0.9344 | 0.9376 | 0.9351 | 0.8889 | 0.8889 | 0.8889 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1 | 0.1 | 0.1053 |
+ | 0.0323 | 1.6 | 20000 | 0.1713 | 22.2173 | 0.7873 | 0.7926 | 0.7893 | 0.8170 | 0.8224 | 0.8190 | 0.8230 | 0.8286 | 0.8251 | 0.9362 | 0.9424 | 0.9385 | 0.9 | 0.9 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.1 | 0.1111 |
+ | 0.0248 | 1.76 | 22000 | 0.1683 | 21.7480 | 0.7934 | 0.7945 | 0.7933 | 0.8235 | 0.8248 | 0.8235 | 0.8294 | 0.8310 | 0.8295 | 0.9407 | 0.9433 | 0.9412 | 0.9 | 0.9 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
+ | 0.0209 | 1.92 | 24000 | 0.1696 | 21.0310 | 0.7982 | 0.8000 | 0.7986 | 0.8275 | 0.8292 | 0.8278 | 0.8331 | 0.8351 | 0.8335 | 0.9424 | 0.9461 | 0.9435 | 0.9 | 0.9 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
+ | 0.0202 | 2.08 | 26000 | 0.1713 | 21.1936 | 0.7954 | 0.7988 | 0.7965 | 0.8250 | 0.8285 | 0.8262 | 0.8309 | 0.8346 | 0.8321 | 0.9404 | 0.9460 | 0.9424 | 0.9 | 0.9091 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0769 | 0.0909 | 0.0833 |
+ | 0.0172 | 2.24 | 28000 | 0.1716 | 20.7761 | 0.8013 | 0.8053 | 0.8027 | 0.8304 | 0.8346 | 0.8319 | 0.8359 | 0.8407 | 0.8376 | 0.9404 | 0.9469 | 0.9428 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2 | 0.1818 | 0.2000 |
+ | 0.0161 | 2.4 | 30000 | 0.1740 | 20.6135 | 0.8052 | 0.8079 | 0.8059 | 0.8351 | 0.8380 | 0.8359 | 0.8408 | 0.8440 | 0.8417 | 0.9439 | 0.9494 | 0.9459 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.125 | 0.1 | 0.1176 |
+ | 0.01 | 2.56 | 32000 | 0.1743 | 20.6948 | 0.8031 | 0.8048 | 0.8033 | 0.8322 | 0.8339 | 0.8323 | 0.8380 | 0.8399 | 0.8382 | 0.9441 | 0.9480 | 0.9452 | 0.9091 | 0.9091 | 0.9062 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
+ | 0.025 | 2.72 | 34000 | 0.1753 | 20.6282 | 0.8033 | 0.8072 | 0.8046 | 0.8327 | 0.8368 | 0.8341 | 0.8383 | 0.8430 | 0.8400 | 0.9419 | 0.9489 | 0.9446 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
+ | 0.0082 | 2.88 | 36000 | 0.1756 | 20.3991 | 0.8060 | 0.8081 | 0.8064 | 0.8354 | 0.8378 | 0.8359 | 0.8406 | 0.8433 | 0.8412 | 0.9436 | 0.9484 | 0.9452 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
+ | 0.013 | 3.04 | 38000 | 0.1754 | 20.3030 | 0.8078 | 0.8097 | 0.8082 | 0.8374 | 0.8395 | 0.8378 | 0.8427 | 0.8452 | 0.8433 | 0.9447 | 0.9488 | 0.9459 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
+ | 0.0183 | 3.2 | 40000 | 0.1756 | 20.1811 | 0.8083 | 0.8102 | 0.8087 | 0.8373 | 0.8394 | 0.8377 | 0.8427 | 0.8450 | 0.8432 | 0.9448 | 0.9489 | 0.9460 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1 | 0.1176 |
 
 
  ### Framework versions
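The hyperparameter change in the README hunk above is training_steps going from 30000 to 40000, with evaluation rows logged every 2000 steps. A minimal sketch, assuming the standard `transformers` `Seq2SeqTrainingArguments` API, of how the listed settings could be expressed; learning rate, batch size, output paths, and save cadence are not part of this diff and are placeholders:

```python
# Sketch only: maps the hyperparameters visible in this diff onto
# transformers.Seq2SeqTrainingArguments. Values marked "placeholder"
# are NOT in the diff and are illustrative assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-medium-finetune",  # placeholder name
    max_steps=40_000,                 # "training_steps: 40000" (was 30000)
    warmup_steps=100,                 # "lr_scheduler_warmup_steps: 100"
    lr_scheduler_type="linear",       # "lr_scheduler_type: linear"
    fp16=True,                        # "mixed_precision_training: Native AMP"
    learning_rate=1e-5,               # placeholder: not listed in this hunk
    per_device_train_batch_size=8,    # placeholder: not listed in this hunk
    evaluation_strategy="steps",      # consistent with the 2000-step eval rows
    eval_steps=2_000,
    save_steps=2_000,                 # placeholder
    predict_with_generate=True,       # assumption: typical for WER evaluation of seq2seq models
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08, as listed above,
    # matches the Trainer's default optimizer settings.
)
```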
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:9ea80ab1f03fb56d2f202288ff2b1a727fbad947dd555dfd8a5c42236d3fccd6
+ oid sha256:bdff08efeece16803895ace3376a1383be06e17657519525de7fe805f92c5481
  size 151109288
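The model.safetensors entry above is a Git LFS pointer update: only the oid (the SHA-256 of the stored blob) changes, while the size stays 151109288 bytes. A small sketch, assuming the file has been materialized locally (for example via `git lfs pull`), of checking a download against the pointer's oid:

```python
# Sketch: verify a locally downloaded LFS object against the pointer's oid.
# The path below is an assumption about where the checkout lives.
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in 1 MiB chunks and return its hex SHA-256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "bdff08efeece16803895ace3376a1383be06e17657519525de7fe805f92c5481"
actual = sha256_of("model.safetensors")
print("match" if actual == expected else f"mismatch: {actual}")
```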
runs/Mar14_11-19-36_sipl-7542-ct/events.out.tfevents.1710415177.sipl-7542-ct.636051.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:8a3363022a81e5cdff29c0bff14c68949082c3a3dea46f244a98dbf98b16d604
- size 372274
+ oid sha256:b65fd76859d8e296e12d8558eb0eb0034a44d2c97a5d1deb0fb4854c85807de7
+ size 392089
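The Wer column tracked in the README diff is a word error rate reported on a 0-100 scale (21.1641 at the old 30000-step checkpoint, 20.1811 at 40000 steps). The metric code itself is not part of this commit; a minimal sketch of how such a percentage is conventionally computed, here with the `evaluate` library as an assumed choice:

```python
# Sketch: word error rate as a percentage, as conventionally reported
# in trainer-generated README tables like the one above.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["the cat sat on mat"]      # toy hypothesis
references  = ["the cat sat on the mat"]  # toy reference

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")  # one deletion over six reference words -> 16.67
```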