End of training
- README.md +4 -68
- added_tokens.json +32 -1
- config.json +4 -4
- model.safetensors +2 -2
- runs/Feb14_22-06-58_sipl-7542-ct/events.out.tfevents.1707948422.sipl-7542-ct.1169959.0 +3 -0
- runs/Feb14_22-17-43_sipl-7542-ct/events.out.tfevents.1707949067.sipl-7542-ct.1173100.0 +3 -0
- runs/Feb14_22-33-54_sipl-7542-ct/events.out.tfevents.1707950037.sipl-7542-ct.1177613.0 +3 -0
- runs/Feb14_22-40-50_sipl-7542-ct/events.out.tfevents.1707950461.sipl-7542-ct.1177613.1 +3 -0
- runs/Feb14_22-41-58_sipl-7542-ct/events.out.tfevents.1707950520.sipl-7542-ct.1177613.2 +3 -0
- runs/Feb14_22-51-17_sipl-7542-ct/events.out.tfevents.1707951078.sipl-7542-ct.1177613.3 +3 -0
- tokenizer_config.json +248 -0
- training_args.bin +1 -1
README.md
CHANGED
@@ -6,10 +6,6 @@ base_model: openai/whisper-medium
 tags:
 - hf-asr-leaderboard
 - generated_from_trainer
-metrics:
-- precision
-- recall
-- f1
 model-index:
 - name: he
   results: []
@@ -21,20 +17,6 @@ should probably proofread and complete it, then remove this comment. -->
 # he
 
 This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on an unknown dataset.
-It achieves the following results on the evaluation set:
-- Loss: 1.6591
-- Precision: 0.0398
-- Recall: 0.0391
-- F1: 0.0391
-- Precision Median: 0.0
-- Recall Median: 0.0
-- F1 Median: 0.0
-- Precision Max: 0.2
-- Recall Max: 0.2
-- F1 Max: 0.2000
-- Precision Min: 0.0
-- Recall Min: 0.0
-- F1 Min: 0.0
 
 ## Model description
 
@@ -53,62 +35,16 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate:
+- learning_rate: 8e-05
-- train_batch_size:
+- train_batch_size: 12
 - eval_batch_size: 8
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- lr_scheduler_warmup_steps:
+- lr_scheduler_warmup_steps: 20
-- training_steps:
+- training_steps: 10000
 - mixed_precision_training: Native AMP
 
-### Training results
-
-| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Precision Median | Recall Median | F1 Median | Precision Max | Recall Max | F1 Max | Precision Min | Recall Min | F1 Min |
-|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:----------------:|:-------------:|:---------:|:-------------:|:----------:|:------:|:-------------:|:----------:|:------:|
-| 7.5196 | 0.01 | 25 | 3.8632 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 |
-| 3.5602 | 0.02 | 50 | 3.2631 | 0.0160 | 0.0368 | 0.0222 | 0.0303 | 0.0556 | 0.04 | 0.0312 | 0.0909 | 0.0455 | 0.0 | 0.0 | 0.0 |
-| 3.1184 | 0.03 | 75 | 3.0632 | 0.0044 | 0.0090 | 0.0059 | 0.0 | 0.0 | 0.0 | 0.0417 | 0.1 | 0.0588 | 0.0 | 0.0 | 0.0 |
-| 2.9519 | 0.04 | 100 | 3.0036 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 |
-| 2.8338 | 0.05 | 125 | 2.7501 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0 | 0 | 0 | 0 | 0 | 0 |
-| 2.6873 | 0.06 | 150 | 2.6214 | 0.0758 | 0.0563 | 0.0643 | 0.0909 | 0.0625 | 0.0714 | 0.2727 | 0.2 | 0.2308 | 0.0 | 0.0 | 0.0 |
-| 2.5495 | 0.07 | 175 | 2.4574 | 0.0310 | 0.0374 | 0.0335 | 0.0 | 0.0 | 0.0 | 0.1176 | 0.2 | 0.1481 | 0.0 | 0.0 | 0.0 |
-| 2.4591 | 0.08 | 200 | 2.4156 | 0.0376 | 0.0571 | 0.0449 | 0.0476 | 0.0625 | 0.0541 | 0.0952 | 0.2 | 0.1290 | 0.0 | 0.0 | 0.0 |
-| 2.3637 | 0.09 | 225 | 2.4155 | 0.0197 | 0.0192 | 0.0194 | 0.0 | 0.0 | 0.0 | 0.125 | 0.125 | 0.125 | 0.0 | 0.0 | 0.0 |
-| 2.3093 | 0.1 | 250 | 2.2890 | 0.0395 | 0.0434 | 0.0410 | 0.0 | 0.0 | 0.0 | 0.125 | 0.1818 | 0.1481 | 0.0 | 0.0 | 0.0 |
-| 2.2333 | 0.11 | 275 | 2.3871 | 0.0817 | 0.0815 | 0.0813 | 0.0625 | 0.0714 | 0.0645 | 0.1875 | 0.1765 | 0.1818 | 0.0 | 0.0 | 0.0 |
-| 2.1783 | 0.12 | 300 | 2.2399 | 0.0188 | 0.0169 | 0.0176 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1176 | 0.1290 | 0.0 | 0.0 | 0.0 |
-| 2.1766 | 0.13 | 325 | 2.1678 | 0.1201 | 0.1392 | 0.1284 | 0.1111 | 0.1333 | 0.1212 | 0.2778 | 0.2941 | 0.2857 | 0.0 | 0.0 | 0.0 |
-| 2.0944 | 0.14 | 350 | 2.2700 | 0.0088 | 0.0062 | 0.0073 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0625 | 0.0714 | 0.0 | 0.0 | 0.0 |
-| 2.0261 | 0.15 | 375 | 2.1503 | 0.0386 | 0.0381 | 0.0382 | 0.0 | 0.0 | 0.0 | 0.1333 | 0.125 | 0.1290 | 0.0 | 0.0 | 0.0 |
-| 1.9888 | 0.16 | 400 | 2.1727 | 0.0117 | 0.0121 | 0.0119 | 0.0 | 0.0 | 0.0 | 0.0556 | 0.0625 | 0.0588 | 0.0 | 0.0 | 0.0 |
-| 1.955 | 0.17 | 425 | 2.1425 | 0.0795 | 0.0947 | 0.0861 | 0.1053 | 0.1176 | 0.1143 | 0.1579 | 0.2 | 0.1765 | 0.0 | 0.0 | 0.0 |
-| 1.9361 | 0.18 | 450 | 2.0521 | 0.0489 | 0.0595 | 0.0533 | 0.0526 | 0.0588 | 0.0556 | 0.1333 | 0.1667 | 0.1290 | 0.0 | 0.0 | 0.0 |
-| 1.8902 | 0.19 | 475 | 2.0196 | 0.0066 | 0.0060 | 0.0063 | 0.0 | 0.0 | 0.0 | 0.0667 | 0.0588 | 0.0606 | 0.0 | 0.0 | 0.0 |
-| 1.8809 | 0.2 | 500 | 2.0199 | 0.0259 | 0.0293 | 0.0274 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.125 | 0.1176 | 0.0 | 0.0 | 0.0 |
-| 1.837 | 0.21 | 525 | 1.9694 | 0.0066 | 0.0067 | 0.0066 | 0.0 | 0.0 | 0.0 | 0.0667 | 0.0714 | 0.0645 | 0.0 | 0.0 | 0.0 |
-| 1.8329 | 0.22 | 550 | 1.9790 | 0.0095 | 0.0106 | 0.0100 | 0.0 | 0.0 | 0.0 | 0.0625 | 0.0714 | 0.0645 | 0.0 | 0.0 | 0.0 |
-| 1.8233 | 0.23 | 575 | 1.8902 | 0.0930 | 0.1220 | 0.1049 | 0.0625 | 0.0833 | 0.0714 | 0.25 | 0.3333 | 0.2703 | 0.0 | 0.0 | 0.0 |
-| 1.7814 | 0.24 | 600 | 1.8473 | 0.0785 | 0.0939 | 0.0851 | 0.0588 | 0.0833 | 0.0690 | 0.2778 | 0.2941 | 0.2857 | 0.0 | 0.0 | 0.0 |
-| 1.7461 | 0.25 | 625 | 1.7672 | 0.0799 | 0.0924 | 0.0852 | 0.0588 | 0.0909 | 0.0714 | 0.1667 | 0.2143 | 0.1875 | 0.0 | 0.0 | 0.0 |
-| 1.7398 | 0.26 | 650 | 1.8822 | 0.0837 | 0.0902 | 0.0863 | 0.0625 | 0.0625 | 0.0625 | 0.3333 | 0.3125 | 0.3226 | 0.0 | 0.0 | 0.0 |
-| 1.7282 | 0.27 | 675 | 1.7915 | 0.0539 | 0.0704 | 0.0605 | 0.0455 | 0.0625 | 0.0526 | 0.1364 | 0.2 | 0.1622 | 0.0 | 0.0 | 0.0 |
-| 1.6463 | 0.28 | 700 | 1.8641 | 0.0292 | 0.0305 | 0.0298 | 0.0 | 0.0 | 0.0 | 0.125 | 0.1333 | 0.1290 | 0.0 | 0.0 | 0.0 |
-| 1.6577 | 0.29 | 725 | 1.7592 | 0.0803 | 0.0886 | 0.0838 | 0.0588 | 0.0714 | 0.0588 | 0.2353 | 0.2353 | 0.2353 | 0.0 | 0.0 | 0.0 |
-| 1.6675 | 0.3 | 750 | 1.7720 | 0.0316 | 0.0426 | 0.0359 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.2143 | 0.1714 | 0.0 | 0.0 | 0.0 |
-| 1.6416 | 0.31 | 775 | 1.7664 | 0.0560 | 0.0674 | 0.0606 | 0.05 | 0.0625 | 0.0588 | 0.1667 | 0.1818 | 0.1714 | 0.0 | 0.0 | 0.0 |
-| 1.6431 | 0.32 | 800 | 1.7466 | 0.0778 | 0.1102 | 0.0893 | 0.0714 | 0.0909 | 0.0714 | 0.25 | 0.3636 | 0.2778 | 0.0 | 0.0 | 0.0 |
-| 1.5706 | 0.33 | 825 | 1.7675 | 0.0603 | 0.0646 | 0.0621 | 0.0526 | 0.0625 | 0.0588 | 0.1579 | 0.1875 | 0.1714 | 0.0 | 0.0 | 0.0 |
-| 1.5303 | 0.34 | 850 | 1.7940 | 0.0133 | 0.0162 | 0.0146 | 0.0 | 0.0 | 0.0 | 0.0556 | 0.0714 | 0.0588 | 0.0 | 0.0 | 0.0 |
-| 1.5507 | 0.35 | 875 | 1.7630 | 0.0371 | 0.0306 | 0.0334 | 0.0 | 0.0 | 0.0 | 0.1667 | 0.1429 | 0.1538 | 0.0 | 0.0 | 0.0 |
-| 1.5245 | 0.36 | 900 | 1.7160 | 0.0526 | 0.0409 | 0.0456 | 0.0769 | 0.05 | 0.0645 | 0.2 | 0.1818 | 0.1905 | 0.0 | 0.0 | 0.0 |
-| 1.5263 | 0.37 | 925 | 1.6858 | 0.0418 | 0.0429 | 0.0420 | 0.0 | 0.0 | 0.0 | 0.1333 | 0.1538 | 0.1333 | 0.0 | 0.0 | 0.0 |
-| 1.517 | 0.38 | 950 | 1.6739 | 0.0360 | 0.0380 | 0.0368 | 0.0 | 0.0 | 0.0 | 0.2 | 0.2 | 0.2000 | 0.0 | 0.0 | 0.0 |
-| 1.488 | 0.39 | 975 | 1.6761 | 0.0395 | 0.0461 | 0.0425 | 0.05 | 0.0556 | 0.0526 | 0.1176 | 0.1429 | 0.1290 | 0.0 | 0.0 | 0.0 |
-| 1.4926 | 0.4 | 1000 | 1.6591 | 0.0398 | 0.0391 | 0.0391 | 0.0 | 0.0 | 0.0 | 0.2 | 0.2 | 0.2000 | 0.0 | 0.0 | 0.0 |
-
-
 ### Framework versions
 
 - Transformers 4.36.2
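The learning-rate schedule implied by the hyperparameters above (`learning_rate: 8e-05`, `lr_scheduler_type: linear`, `lr_scheduler_warmup_steps: 20`, `training_steps: 10000`) can be sketched in plain Python. This mirrors the formula of transformers' `get_linear_schedule_with_warmup`; it is an illustration, not the training script itself:

```python
def linear_lr(step: int, base_lr: float = 8e-05,
              warmup_steps: int = 20, training_steps: int = 10000) -> float:
    """Learning rate at a given step under linear warmup then linear decay."""
    if step < warmup_steps:
        # Linear warmup from 0 up to base_lr over the first warmup_steps steps.
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr down to 0 at training_steps.
    return base_lr * max(0.0, (training_steps - step) / max(1, training_steps - warmup_steps))

print(linear_lr(0))      # 0.0
print(linear_lr(20))     # 8e-05 (warmup complete)
print(linear_lr(10000))  # 0.0
```

With only 20 warmup steps out of 10000, the schedule is effectively a straight decay from 8e-05 to zero for nearly the whole run.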
added_tokens.json
CHANGED
@@ -1605,5 +1605,36 @@
   "<|vi|>": 50278,
   "<|yi|>": 50335,
   "<|yo|>": 50325,
-  "<|zh|>": 50260
+  "<|zh|>": 50260,
+  "֑": 51865,
+  "֒": 51866,
+  "֓": 51867,
+  "֔": 51868,
+  "֕": 51869,
+  "֖": 51870,
+  "֗": 51871,
+  "֘": 51872,
+  "֙": 51873,
+  "֚": 51874,
+  "֛": 51875,
+  "֜": 51876,
+  "֝": 51877,
+  "֞": 51878,
+  "֟": 51879,
+  "֠": 51880,
+  "֡": 51881,
+  "֢": 51882,
+  "֣": 51883,
+  "֤": 51884,
+  "֥": 51885,
+  "֦": 51886,
+  "֧": 51887,
+  "֨": 51888,
+  "֩": 51889,
+  "֪": 51890,
+  "֫": 51891,
+  "֬": 51892,
+  "֭": 51893,
+  "֮": 51894,
+  "ֽ": 51895
 }
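The 31 new entries are the Hebrew cantillation accents U+0591 through U+05AE plus the meteg U+05BD, appended with consecutive ids directly after Whisper's original 51865-token vocabulary. A minimal sketch reconstructing the mapping (illustrative only, not the script that produced added_tokens.json):

```python
# Hebrew cantillation accents U+0591..U+05AE, then the meteg U+05BD,
# assigned consecutive ids starting at 51865 (Whisper's original vocab size).
cantillation = [chr(cp) for cp in range(0x0591, 0x05AE + 1)] + ["\u05BD"]
added_tokens = {ch: 51865 + i for i, ch in enumerate(cantillation)}

print(len(added_tokens))               # 31
print(added_tokens["\u0591"])          # 51865 (etnahta, the first new token)
print(added_tokens["\u05BD"])          # 51895 (meteg, the last new token)
print(max(added_tokens.values()) + 1)  # 51896, the new vocab_size in config.json
```

In practice such tokens would typically be registered with `tokenizer.add_tokens(...)` followed by `model.resize_token_embeddings(len(tokenizer))`, which is consistent with the vocab_size bump to 51896 in config.json below.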
config.json
CHANGED
@@ -1,12 +1,12 @@
 {
   "_name_or_path": "openai/whisper-medium",
-  "activation_dropout": 0.
+  "activation_dropout": 0.1,
   "activation_function": "gelu",
   "apply_spec_augment": false,
   "architectures": [
     "WhisperForConditionalGeneration"
   ],
-  "attention_dropout": 0.
+  "attention_dropout": 0.1,
   "begin_suppress_tokens": [
     220,
     50257
@@ -20,7 +20,7 @@
   "decoder_layerdrop": 0.0,
   "decoder_layers": 24,
   "decoder_start_token_id": 50258,
-  "dropout": 0.
+  "dropout": 0.1,
   "encoder_attention_heads": 16,
   "encoder_ffn_dim": 4096,
   "encoder_layerdrop": 0.0,
@@ -49,5 +49,5 @@
   "transformers_version": "4.36.2",
   "use_cache": true,
   "use_weighted_layer_sum": false,
-  "vocab_size":
+  "vocab_size": 51896
 }
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:8e5e0d51f4c4c74b2d63e48352b9a4e70f6e3885b81b8182dd047346af69a94a
+size 3055671280
runs/Feb14_22-06-58_sipl-7542-ct/events.out.tfevents.1707948422.sipl-7542-ct.1169959.0
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:03e10cd1408fd748e9cb34010e743490a16cf7d56805c1286d38c19b0aed1a2e
+size 11619

runs/Feb14_22-17-43_sipl-7542-ct/events.out.tfevents.1707949067.sipl-7542-ct.1173100.0
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:06312932365fd83e73cd8d430daa4aa94eb50c10fbe180d921ecf6b100319751
+size 18299

runs/Feb14_22-33-54_sipl-7542-ct/events.out.tfevents.1707950037.sipl-7542-ct.1177613.0
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:46ccc168b970fb815f192b3b38a00ebc10b184b92502e4489fe206242e5b5524
+size 12928

runs/Feb14_22-40-50_sipl-7542-ct/events.out.tfevents.1707950461.sipl-7542-ct.1177613.1
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:3f16b3f875748fd2b73bd255a8e257fadfd991e0cf7732d403e066c2e1356723
+size 5616

runs/Feb14_22-41-58_sipl-7542-ct/events.out.tfevents.1707950520.sipl-7542-ct.1177613.2
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ee60cb5d93f5f82aa2316f775c34b0ae9a7ee30d1ddfe652ce2cb8c4b600fe47
+size 8300

runs/Feb14_22-51-17_sipl-7542-ct/events.out.tfevents.1707951078.sipl-7542-ct.1177613.3
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:bf353ad348cd23a9aef510b3185f2d96afe7de4faffd8539a15ed7b70de461d8
+size 4136
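The binary files in this commit (model.safetensors, training_args.bin, and the TensorBoard run files above) are stored as Git LFS pointer files — three `key value` lines giving the spec version, content hash, and byte size — rather than the raw bytes. A minimal, hypothetical parser for such a pointer:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# Pointer contents taken from the first run file added in this commit.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:03e10cd1408fd748e9cb34010e743490a16cf7d56805c1286d38c19b0aed1a2e
size 11619"""

info = parse_lfs_pointer(pointer)
print(info["size"])  # 11619
```

The `oid` is the SHA-256 of the actual file content, so `size 3055671280` for model.safetensors refers to the real ~3 GB checkpoint, not the few-line pointer.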
tokenizer_config.json
CHANGED
@@ -12865,6 +12865,254 @@
       "rstrip": false,
       "single_word": false,
       "special": false
+    },
+    "51865": {
+      "content": "֑",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51866": {
+      "content": "֒",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51867": {
+      "content": "֓",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51868": {
+      "content": "֔",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51869": {
+      "content": "֕",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51870": {
+      "content": "֖",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51871": {
+      "content": "֗",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51872": {
+      "content": "֘",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51873": {
+      "content": "֙",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51874": {
+      "content": "֚",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51875": {
+      "content": "֛",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51876": {
+      "content": "֜",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51877": {
+      "content": "֝",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51878": {
+      "content": "֞",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51879": {
+      "content": "֟",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51880": {
+      "content": "֠",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51881": {
+      "content": "֡",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51882": {
+      "content": "֢",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51883": {
+      "content": "֣",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51884": {
+      "content": "֤",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51885": {
+      "content": "֥",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51886": {
+      "content": "֦",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51887": {
+      "content": "֧",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51888": {
+      "content": "֨",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51889": {
+      "content": "֩",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51890": {
+      "content": "֪",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51891": {
+      "content": "֫",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51892": {
+      "content": "֬",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51893": {
+      "content": "֭",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51894": {
+      "content": "֮",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
+    },
+    "51895": {
+      "content": "ֽ",
+      "lstrip": false,
+      "normalized": true,
+      "rstrip": false,
+      "single_word": false,
+      "special": false
     }
   },
   "additional_special_tokens": [
training_args.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:9812ba4cbe334061017bf2f7182ab888b7ae14d0476e12140519cc2574234888
 size 4475