luyotw committed on
Commit
388eae8
·
verified ·
1 Parent(s): 2551688

Upload folder using huggingface_hub

This view is limited to 50 files because the commit contains too many changes; see the raw diff for the full list.
Files changed (50)
  1. added_tokens.json +1609 -0
  2. checkpoint-1000/config.json +60 -0
  3. checkpoint-1000/generation_config.json +254 -0
  4. checkpoint-1000/model.safetensors +3 -0
  5. checkpoint-1000/optimizer.pt +3 -0
  6. checkpoint-1000/preprocessor_config.json +15 -0
  7. checkpoint-1000/rng_state.pth +3 -0
  8. checkpoint-1000/scheduler.pt +3 -0
  9. checkpoint-1000/trainer_state.json +323 -0
  10. checkpoint-1000/training_args.bin +3 -0
  11. checkpoint-2000/config.json +60 -0
  12. checkpoint-2000/generation_config.json +254 -0
  13. checkpoint-2000/model.safetensors +3 -0
  14. checkpoint-2000/optimizer.pt +3 -0
  15. checkpoint-2000/preprocessor_config.json +15 -0
  16. checkpoint-2000/rng_state.pth +3 -0
  17. checkpoint-2000/scheduler.pt +3 -0
  18. checkpoint-2000/trainer_state.json +612 -0
  19. checkpoint-2000/training_args.bin +3 -0
  20. checkpoint-3000/config.json +60 -0
  21. checkpoint-3000/generation_config.json +254 -0
  22. checkpoint-3000/model.safetensors +3 -0
  23. checkpoint-3000/optimizer.pt +3 -0
  24. checkpoint-3000/preprocessor_config.json +15 -0
  25. checkpoint-3000/rng_state.pth +3 -0
  26. checkpoint-3000/scheduler.pt +3 -0
  27. checkpoint-3000/trainer_state.json +901 -0
  28. checkpoint-3000/training_args.bin +3 -0
  29. checkpoint-4000/config.json +60 -0
  30. checkpoint-4000/generation_config.json +254 -0
  31. checkpoint-4000/model.safetensors +3 -0
  32. checkpoint-4000/optimizer.pt +3 -0
  33. checkpoint-4000/preprocessor_config.json +15 -0
  34. checkpoint-4000/rng_state.pth +3 -0
  35. checkpoint-4000/scheduler.pt +3 -0
  36. checkpoint-4000/trainer_state.json +1190 -0
  37. checkpoint-4000/training_args.bin +3 -0
  38. checkpoint-5000/config.json +60 -0
  39. checkpoint-5000/generation_config.json +254 -0
  40. checkpoint-5000/model.safetensors +3 -0
  41. checkpoint-5000/optimizer.pt +3 -0
  42. checkpoint-5000/preprocessor_config.json +15 -0
  43. checkpoint-5000/rng_state.pth +3 -0
  44. checkpoint-5000/scheduler.pt +3 -0
  45. checkpoint-5000/trainer_state.json +1479 -0
  46. checkpoint-5000/training_args.bin +3 -0
  47. config.json +60 -0
  48. generation_config.json +254 -0
  49. merges.txt +0 -0
  50. model.safetensors +3 -0
added_tokens.json ADDED
@@ -0,0 +1,1609 @@
+ {
+ "<|0.00|>": 50364,
+ "<|0.02|>": 50365,
+ "<|0.04|>": 50366,
+ "<|0.06|>": 50367,
+ "<|0.08|>": 50368,
+ "<|0.10|>": 50369,
+ "<|0.12|>": 50370,
+ "<|0.14|>": 50371,
+ "<|0.16|>": 50372,
+ "<|0.18|>": 50373,
+ "<|0.20|>": 50374,
+ "<|0.22|>": 50375,
+ "<|0.24|>": 50376,
+ "<|0.26|>": 50377,
+ "<|0.28|>": 50378,
+ "<|0.30|>": 50379,
+ "<|0.32|>": 50380,
+ "<|0.34|>": 50381,
+ "<|0.36|>": 50382,
+ "<|0.38|>": 50383,
+ "<|0.40|>": 50384,
+ "<|0.42|>": 50385,
+ "<|0.44|>": 50386,
+ "<|0.46|>": 50387,
+ "<|0.48|>": 50388,
+ "<|0.50|>": 50389,
+ "<|0.52|>": 50390,
+ "<|0.54|>": 50391,
+ "<|0.56|>": 50392,
+ "<|0.58|>": 50393,
+ "<|0.60|>": 50394,
+ "<|0.62|>": 50395,
+ "<|0.64|>": 50396,
+ "<|0.66|>": 50397,
+ "<|0.68|>": 50398,
+ "<|0.70|>": 50399,
+ "<|0.72|>": 50400,
+ "<|0.74|>": 50401,
+ "<|0.76|>": 50402,
+ "<|0.78|>": 50403,
+ "<|0.80|>": 50404,
+ "<|0.82|>": 50405,
+ "<|0.84|>": 50406,
+ "<|0.86|>": 50407,
+ "<|0.88|>": 50408,
+ "<|0.90|>": 50409,
+ "<|0.92|>": 50410,
+ "<|0.94|>": 50411,
+ "<|0.96|>": 50412,
+ "<|0.98|>": 50413,
+ "<|1.00|>": 50414,
+ "<|1.02|>": 50415,
+ "<|1.04|>": 50416,
+ "<|1.06|>": 50417,
+ "<|1.08|>": 50418,
+ "<|1.10|>": 50419,
+ "<|1.12|>": 50420,
+ "<|1.14|>": 50421,
+ "<|1.16|>": 50422,
+ "<|1.18|>": 50423,
+ "<|1.20|>": 50424,
+ "<|1.22|>": 50425,
+ "<|1.24|>": 50426,
+ "<|1.26|>": 50427,
+ "<|1.28|>": 50428,
+ "<|1.30|>": 50429,
+ "<|1.32|>": 50430,
+ "<|1.34|>": 50431,
+ "<|1.36|>": 50432,
+ "<|1.38|>": 50433,
+ "<|1.40|>": 50434,
+ "<|1.42|>": 50435,
+ "<|1.44|>": 50436,
+ "<|1.46|>": 50437,
+ "<|1.48|>": 50438,
+ "<|1.50|>": 50439,
+ "<|1.52|>": 50440,
+ "<|1.54|>": 50441,
+ "<|1.56|>": 50442,
+ "<|1.58|>": 50443,
+ "<|1.60|>": 50444,
+ "<|1.62|>": 50445,
+ "<|1.64|>": 50446,
+ "<|1.66|>": 50447,
+ "<|1.68|>": 50448,
+ "<|1.70|>": 50449,
+ "<|1.72|>": 50450,
+ "<|1.74|>": 50451,
+ "<|1.76|>": 50452,
+ "<|1.78|>": 50453,
+ "<|1.80|>": 50454,
+ "<|1.82|>": 50455,
+ "<|1.84|>": 50456,
+ "<|1.86|>": 50457,
+ "<|1.88|>": 50458,
+ "<|1.90|>": 50459,
+ "<|1.92|>": 50460,
+ "<|1.94|>": 50461,
+ "<|1.96|>": 50462,
+ "<|1.98|>": 50463,
+ "<|10.00|>": 50864,
+ "<|10.02|>": 50865,
+ "<|10.04|>": 50866,
+ "<|10.06|>": 50867,
+ "<|10.08|>": 50868,
+ "<|10.10|>": 50869,
+ "<|10.12|>": 50870,
+ "<|10.14|>": 50871,
+ "<|10.16|>": 50872,
+ "<|10.18|>": 50873,
+ "<|10.20|>": 50874,
+ "<|10.22|>": 50875,
+ "<|10.24|>": 50876,
+ "<|10.26|>": 50877,
+ "<|10.28|>": 50878,
+ "<|10.30|>": 50879,
+ "<|10.32|>": 50880,
+ "<|10.34|>": 50881,
+ "<|10.36|>": 50882,
+ "<|10.38|>": 50883,
+ "<|10.40|>": 50884,
+ "<|10.42|>": 50885,
+ "<|10.44|>": 50886,
+ "<|10.46|>": 50887,
+ "<|10.48|>": 50888,
+ "<|10.50|>": 50889,
+ "<|10.52|>": 50890,
+ "<|10.54|>": 50891,
+ "<|10.56|>": 50892,
+ "<|10.58|>": 50893,
+ "<|10.60|>": 50894,
+ "<|10.62|>": 50895,
+ "<|10.64|>": 50896,
+ "<|10.66|>": 50897,
+ "<|10.68|>": 50898,
+ "<|10.70|>": 50899,
+ "<|10.72|>": 50900,
+ "<|10.74|>": 50901,
+ "<|10.76|>": 50902,
+ "<|10.78|>": 50903,
+ "<|10.80|>": 50904,
+ "<|10.82|>": 50905,
+ "<|10.84|>": 50906,
+ "<|10.86|>": 50907,
+ "<|10.88|>": 50908,
+ "<|10.90|>": 50909,
+ "<|10.92|>": 50910,
+ "<|10.94|>": 50911,
+ "<|10.96|>": 50912,
+ "<|10.98|>": 50913,
+ "<|11.00|>": 50914,
+ "<|11.02|>": 50915,
+ "<|11.04|>": 50916,
+ "<|11.06|>": 50917,
+ "<|11.08|>": 50918,
+ "<|11.10|>": 50919,
+ "<|11.12|>": 50920,
+ "<|11.14|>": 50921,
+ "<|11.16|>": 50922,
+ "<|11.18|>": 50923,
+ "<|11.20|>": 50924,
+ "<|11.22|>": 50925,
+ "<|11.24|>": 50926,
+ "<|11.26|>": 50927,
+ "<|11.28|>": 50928,
+ "<|11.30|>": 50929,
+ "<|11.32|>": 50930,
+ "<|11.34|>": 50931,
+ "<|11.36|>": 50932,
+ "<|11.38|>": 50933,
+ "<|11.40|>": 50934,
+ "<|11.42|>": 50935,
+ "<|11.44|>": 50936,
+ "<|11.46|>": 50937,
+ "<|11.48|>": 50938,
+ "<|11.50|>": 50939,
+ "<|11.52|>": 50940,
+ "<|11.54|>": 50941,
+ "<|11.56|>": 50942,
+ "<|11.58|>": 50943,
+ "<|11.60|>": 50944,
+ "<|11.62|>": 50945,
+ "<|11.64|>": 50946,
+ "<|11.66|>": 50947,
+ "<|11.68|>": 50948,
+ "<|11.70|>": 50949,
+ "<|11.72|>": 50950,
+ "<|11.74|>": 50951,
+ "<|11.76|>": 50952,
+ "<|11.78|>": 50953,
+ "<|11.80|>": 50954,
+ "<|11.82|>": 50955,
+ "<|11.84|>": 50956,
+ "<|11.86|>": 50957,
+ "<|11.88|>": 50958,
+ "<|11.90|>": 50959,
+ "<|11.92|>": 50960,
+ "<|11.94|>": 50961,
+ "<|11.96|>": 50962,
+ "<|11.98|>": 50963,
+ "<|12.00|>": 50964,
+ "<|12.02|>": 50965,
+ "<|12.04|>": 50966,
+ "<|12.06|>": 50967,
+ "<|12.08|>": 50968,
+ "<|12.10|>": 50969,
+ "<|12.12|>": 50970,
+ "<|12.14|>": 50971,
+ "<|12.16|>": 50972,
+ "<|12.18|>": 50973,
+ "<|12.20|>": 50974,
+ "<|12.22|>": 50975,
+ "<|12.24|>": 50976,
+ "<|12.26|>": 50977,
+ "<|12.28|>": 50978,
+ "<|12.30|>": 50979,
+ "<|12.32|>": 50980,
+ "<|12.34|>": 50981,
+ "<|12.36|>": 50982,
+ "<|12.38|>": 50983,
+ "<|12.40|>": 50984,
+ "<|12.42|>": 50985,
+ "<|12.44|>": 50986,
+ "<|12.46|>": 50987,
+ "<|12.48|>": 50988,
+ "<|12.50|>": 50989,
+ "<|12.52|>": 50990,
+ "<|12.54|>": 50991,
+ "<|12.56|>": 50992,
+ "<|12.58|>": 50993,
+ "<|12.60|>": 50994,
+ "<|12.62|>": 50995,
+ "<|12.64|>": 50996,
+ "<|12.66|>": 50997,
+ "<|12.68|>": 50998,
+ "<|12.70|>": 50999,
+ "<|12.72|>": 51000,
+ "<|12.74|>": 51001,
+ "<|12.76|>": 51002,
+ "<|12.78|>": 51003,
+ "<|12.80|>": 51004,
+ "<|12.82|>": 51005,
+ "<|12.84|>": 51006,
+ "<|12.86|>": 51007,
+ "<|12.88|>": 51008,
+ "<|12.90|>": 51009,
+ "<|12.92|>": 51010,
+ "<|12.94|>": 51011,
+ "<|12.96|>": 51012,
+ "<|12.98|>": 51013,
+ "<|13.00|>": 51014,
+ "<|13.02|>": 51015,
+ "<|13.04|>": 51016,
+ "<|13.06|>": 51017,
+ "<|13.08|>": 51018,
+ "<|13.10|>": 51019,
+ "<|13.12|>": 51020,
+ "<|13.14|>": 51021,
+ "<|13.16|>": 51022,
+ "<|13.18|>": 51023,
+ "<|13.20|>": 51024,
+ "<|13.22|>": 51025,
+ "<|13.24|>": 51026,
+ "<|13.26|>": 51027,
+ "<|13.28|>": 51028,
+ "<|13.30|>": 51029,
+ "<|13.32|>": 51030,
+ "<|13.34|>": 51031,
+ "<|13.36|>": 51032,
+ "<|13.38|>": 51033,
+ "<|13.40|>": 51034,
+ "<|13.42|>": 51035,
+ "<|13.44|>": 51036,
+ "<|13.46|>": 51037,
+ "<|13.48|>": 51038,
+ "<|13.50|>": 51039,
+ "<|13.52|>": 51040,
+ "<|13.54|>": 51041,
+ "<|13.56|>": 51042,
+ "<|13.58|>": 51043,
+ "<|13.60|>": 51044,
+ "<|13.62|>": 51045,
+ "<|13.64|>": 51046,
+ "<|13.66|>": 51047,
+ "<|13.68|>": 51048,
+ "<|13.70|>": 51049,
+ "<|13.72|>": 51050,
+ "<|13.74|>": 51051,
+ "<|13.76|>": 51052,
+ "<|13.78|>": 51053,
+ "<|13.80|>": 51054,
+ "<|13.82|>": 51055,
+ "<|13.84|>": 51056,
+ "<|13.86|>": 51057,
+ "<|13.88|>": 51058,
+ "<|13.90|>": 51059,
+ "<|13.92|>": 51060,
+ "<|13.94|>": 51061,
+ "<|13.96|>": 51062,
+ "<|13.98|>": 51063,
+ "<|14.00|>": 51064,
+ "<|14.02|>": 51065,
+ "<|14.04|>": 51066,
+ "<|14.06|>": 51067,
+ "<|14.08|>": 51068,
+ "<|14.10|>": 51069,
+ "<|14.12|>": 51070,
+ "<|14.14|>": 51071,
+ "<|14.16|>": 51072,
+ "<|14.18|>": 51073,
+ "<|14.20|>": 51074,
+ "<|14.22|>": 51075,
+ "<|14.24|>": 51076,
+ "<|14.26|>": 51077,
+ "<|14.28|>": 51078,
+ "<|14.30|>": 51079,
+ "<|14.32|>": 51080,
+ "<|14.34|>": 51081,
+ "<|14.36|>": 51082,
+ "<|14.38|>": 51083,
+ "<|14.40|>": 51084,
+ "<|14.42|>": 51085,
+ "<|14.44|>": 51086,
+ "<|14.46|>": 51087,
+ "<|14.48|>": 51088,
+ "<|14.50|>": 51089,
+ "<|14.52|>": 51090,
+ "<|14.54|>": 51091,
+ "<|14.56|>": 51092,
+ "<|14.58|>": 51093,
+ "<|14.60|>": 51094,
+ "<|14.62|>": 51095,
+ "<|14.64|>": 51096,
+ "<|14.66|>": 51097,
+ "<|14.68|>": 51098,
+ "<|14.70|>": 51099,
+ "<|14.72|>": 51100,
+ "<|14.74|>": 51101,
+ "<|14.76|>": 51102,
+ "<|14.78|>": 51103,
+ "<|14.80|>": 51104,
+ "<|14.82|>": 51105,
+ "<|14.84|>": 51106,
+ "<|14.86|>": 51107,
+ "<|14.88|>": 51108,
+ "<|14.90|>": 51109,
+ "<|14.92|>": 51110,
+ "<|14.94|>": 51111,
+ "<|14.96|>": 51112,
+ "<|14.98|>": 51113,
+ "<|15.00|>": 51114,
+ "<|15.02|>": 51115,
+ "<|15.04|>": 51116,
+ "<|15.06|>": 51117,
+ "<|15.08|>": 51118,
+ "<|15.10|>": 51119,
+ "<|15.12|>": 51120,
+ "<|15.14|>": 51121,
+ "<|15.16|>": 51122,
+ "<|15.18|>": 51123,
+ "<|15.20|>": 51124,
+ "<|15.22|>": 51125,
+ "<|15.24|>": 51126,
+ "<|15.26|>": 51127,
+ "<|15.28|>": 51128,
+ "<|15.30|>": 51129,
+ "<|15.32|>": 51130,
+ "<|15.34|>": 51131,
+ "<|15.36|>": 51132,
+ "<|15.38|>": 51133,
+ "<|15.40|>": 51134,
+ "<|15.42|>": 51135,
+ "<|15.44|>": 51136,
+ "<|15.46|>": 51137,
+ "<|15.48|>": 51138,
+ "<|15.50|>": 51139,
+ "<|15.52|>": 51140,
+ "<|15.54|>": 51141,
+ "<|15.56|>": 51142,
+ "<|15.58|>": 51143,
+ "<|15.60|>": 51144,
+ "<|15.62|>": 51145,
+ "<|15.64|>": 51146,
+ "<|15.66|>": 51147,
+ "<|15.68|>": 51148,
+ "<|15.70|>": 51149,
+ "<|15.72|>": 51150,
+ "<|15.74|>": 51151,
+ "<|15.76|>": 51152,
+ "<|15.78|>": 51153,
+ "<|15.80|>": 51154,
+ "<|15.82|>": 51155,
+ "<|15.84|>": 51156,
+ "<|15.86|>": 51157,
+ "<|15.88|>": 51158,
+ "<|15.90|>": 51159,
+ "<|15.92|>": 51160,
+ "<|15.94|>": 51161,
+ "<|15.96|>": 51162,
+ "<|15.98|>": 51163,
+ "<|16.00|>": 51164,
+ "<|16.02|>": 51165,
+ "<|16.04|>": 51166,
+ "<|16.06|>": 51167,
+ "<|16.08|>": 51168,
+ "<|16.10|>": 51169,
+ "<|16.12|>": 51170,
+ "<|16.14|>": 51171,
+ "<|16.16|>": 51172,
+ "<|16.18|>": 51173,
+ "<|16.20|>": 51174,
+ "<|16.22|>": 51175,
+ "<|16.24|>": 51176,
+ "<|16.26|>": 51177,
+ "<|16.28|>": 51178,
+ "<|16.30|>": 51179,
+ "<|16.32|>": 51180,
+ "<|16.34|>": 51181,
+ "<|16.36|>": 51182,
+ "<|16.38|>": 51183,
+ "<|16.40|>": 51184,
+ "<|16.42|>": 51185,
+ "<|16.44|>": 51186,
+ "<|16.46|>": 51187,
+ "<|16.48|>": 51188,
+ "<|16.50|>": 51189,
+ "<|16.52|>": 51190,
+ "<|16.54|>": 51191,
+ "<|16.56|>": 51192,
+ "<|16.58|>": 51193,
+ "<|16.60|>": 51194,
+ "<|16.62|>": 51195,
+ "<|16.64|>": 51196,
+ "<|16.66|>": 51197,
+ "<|16.68|>": 51198,
+ "<|16.70|>": 51199,
+ "<|16.72|>": 51200,
+ "<|16.74|>": 51201,
+ "<|16.76|>": 51202,
+ "<|16.78|>": 51203,
+ "<|16.80|>": 51204,
+ "<|16.82|>": 51205,
+ "<|16.84|>": 51206,
+ "<|16.86|>": 51207,
+ "<|16.88|>": 51208,
+ "<|16.90|>": 51209,
+ "<|16.92|>": 51210,
+ "<|16.94|>": 51211,
+ "<|16.96|>": 51212,
+ "<|16.98|>": 51213,
+ "<|17.00|>": 51214,
+ "<|17.02|>": 51215,
+ "<|17.04|>": 51216,
+ "<|17.06|>": 51217,
+ "<|17.08|>": 51218,
+ "<|17.10|>": 51219,
+ "<|17.12|>": 51220,
+ "<|17.14|>": 51221,
+ "<|17.16|>": 51222,
+ "<|17.18|>": 51223,
+ "<|17.20|>": 51224,
+ "<|17.22|>": 51225,
+ "<|17.24|>": 51226,
+ "<|17.26|>": 51227,
+ "<|17.28|>": 51228,
+ "<|17.30|>": 51229,
+ "<|17.32|>": 51230,
+ "<|17.34|>": 51231,
+ "<|17.36|>": 51232,
+ "<|17.38|>": 51233,
+ "<|17.40|>": 51234,
+ "<|17.42|>": 51235,
+ "<|17.44|>": 51236,
+ "<|17.46|>": 51237,
+ "<|17.48|>": 51238,
+ "<|17.50|>": 51239,
+ "<|17.52|>": 51240,
+ "<|17.54|>": 51241,
+ "<|17.56|>": 51242,
+ "<|17.58|>": 51243,
+ "<|17.60|>": 51244,
+ "<|17.62|>": 51245,
+ "<|17.64|>": 51246,
+ "<|17.66|>": 51247,
+ "<|17.68|>": 51248,
+ "<|17.70|>": 51249,
+ "<|17.72|>": 51250,
+ "<|17.74|>": 51251,
+ "<|17.76|>": 51252,
+ "<|17.78|>": 51253,
+ "<|17.80|>": 51254,
+ "<|17.82|>": 51255,
+ "<|17.84|>": 51256,
+ "<|17.86|>": 51257,
+ "<|17.88|>": 51258,
+ "<|17.90|>": 51259,
+ "<|17.92|>": 51260,
+ "<|17.94|>": 51261,
+ "<|17.96|>": 51262,
+ "<|17.98|>": 51263,
+ "<|18.00|>": 51264,
+ "<|18.02|>": 51265,
+ "<|18.04|>": 51266,
+ "<|18.06|>": 51267,
+ "<|18.08|>": 51268,
+ "<|18.10|>": 51269,
+ "<|18.12|>": 51270,
+ "<|18.14|>": 51271,
+ "<|18.16|>": 51272,
+ "<|18.18|>": 51273,
+ "<|18.20|>": 51274,
+ "<|18.22|>": 51275,
+ "<|18.24|>": 51276,
+ "<|18.26|>": 51277,
+ "<|18.28|>": 51278,
+ "<|18.30|>": 51279,
+ "<|18.32|>": 51280,
+ "<|18.34|>": 51281,
+ "<|18.36|>": 51282,
+ "<|18.38|>": 51283,
+ "<|18.40|>": 51284,
+ "<|18.42|>": 51285,
+ "<|18.44|>": 51286,
+ "<|18.46|>": 51287,
+ "<|18.48|>": 51288,
+ "<|18.50|>": 51289,
+ "<|18.52|>": 51290,
+ "<|18.54|>": 51291,
+ "<|18.56|>": 51292,
+ "<|18.58|>": 51293,
+ "<|18.60|>": 51294,
+ "<|18.62|>": 51295,
+ "<|18.64|>": 51296,
+ "<|18.66|>": 51297,
+ "<|18.68|>": 51298,
+ "<|18.70|>": 51299,
+ "<|18.72|>": 51300,
+ "<|18.74|>": 51301,
+ "<|18.76|>": 51302,
+ "<|18.78|>": 51303,
+ "<|18.80|>": 51304,
+ "<|18.82|>": 51305,
+ "<|18.84|>": 51306,
+ "<|18.86|>": 51307,
+ "<|18.88|>": 51308,
+ "<|18.90|>": 51309,
+ "<|18.92|>": 51310,
+ "<|18.94|>": 51311,
+ "<|18.96|>": 51312,
+ "<|18.98|>": 51313,
+ "<|19.00|>": 51314,
+ "<|19.02|>": 51315,
+ "<|19.04|>": 51316,
+ "<|19.06|>": 51317,
+ "<|19.08|>": 51318,
+ "<|19.10|>": 51319,
+ "<|19.12|>": 51320,
+ "<|19.14|>": 51321,
+ "<|19.16|>": 51322,
+ "<|19.18|>": 51323,
+ "<|19.20|>": 51324,
+ "<|19.22|>": 51325,
+ "<|19.24|>": 51326,
+ "<|19.26|>": 51327,
+ "<|19.28|>": 51328,
+ "<|19.30|>": 51329,
+ "<|19.32|>": 51330,
+ "<|19.34|>": 51331,
+ "<|19.36|>": 51332,
+ "<|19.38|>": 51333,
+ "<|19.40|>": 51334,
+ "<|19.42|>": 51335,
+ "<|19.44|>": 51336,
+ "<|19.46|>": 51337,
+ "<|19.48|>": 51338,
+ "<|19.50|>": 51339,
+ "<|19.52|>": 51340,
+ "<|19.54|>": 51341,
+ "<|19.56|>": 51342,
+ "<|19.58|>": 51343,
+ "<|19.60|>": 51344,
+ "<|19.62|>": 51345,
+ "<|19.64|>": 51346,
+ "<|19.66|>": 51347,
+ "<|19.68|>": 51348,
+ "<|19.70|>": 51349,
+ "<|19.72|>": 51350,
+ "<|19.74|>": 51351,
+ "<|19.76|>": 51352,
+ "<|19.78|>": 51353,
+ "<|19.80|>": 51354,
+ "<|19.82|>": 51355,
+ "<|19.84|>": 51356,
+ "<|19.86|>": 51357,
+ "<|19.88|>": 51358,
+ "<|19.90|>": 51359,
+ "<|19.92|>": 51360,
+ "<|19.94|>": 51361,
+ "<|19.96|>": 51362,
+ "<|19.98|>": 51363,
+ "<|2.00|>": 50464,
+ "<|2.02|>": 50465,
+ "<|2.04|>": 50466,
+ "<|2.06|>": 50467,
+ "<|2.08|>": 50468,
+ "<|2.10|>": 50469,
+ "<|2.12|>": 50470,
+ "<|2.14|>": 50471,
+ "<|2.16|>": 50472,
+ "<|2.18|>": 50473,
+ "<|2.20|>": 50474,
+ "<|2.22|>": 50475,
+ "<|2.24|>": 50476,
+ "<|2.26|>": 50477,
+ "<|2.28|>": 50478,
+ "<|2.30|>": 50479,
+ "<|2.32|>": 50480,
+ "<|2.34|>": 50481,
+ "<|2.36|>": 50482,
+ "<|2.38|>": 50483,
+ "<|2.40|>": 50484,
+ "<|2.42|>": 50485,
+ "<|2.44|>": 50486,
+ "<|2.46|>": 50487,
+ "<|2.48|>": 50488,
+ "<|2.50|>": 50489,
+ "<|2.52|>": 50490,
+ "<|2.54|>": 50491,
+ "<|2.56|>": 50492,
+ "<|2.58|>": 50493,
+ "<|2.60|>": 50494,
+ "<|2.62|>": 50495,
+ "<|2.64|>": 50496,
+ "<|2.66|>": 50497,
+ "<|2.68|>": 50498,
+ "<|2.70|>": 50499,
+ "<|2.72|>": 50500,
+ "<|2.74|>": 50501,
+ "<|2.76|>": 50502,
+ "<|2.78|>": 50503,
+ "<|2.80|>": 50504,
+ "<|2.82|>": 50505,
+ "<|2.84|>": 50506,
+ "<|2.86|>": 50507,
+ "<|2.88|>": 50508,
+ "<|2.90|>": 50509,
+ "<|2.92|>": 50510,
+ "<|2.94|>": 50511,
+ "<|2.96|>": 50512,
+ "<|2.98|>": 50513,
+ "<|20.00|>": 51364,
+ "<|20.02|>": 51365,
+ "<|20.04|>": 51366,
+ "<|20.06|>": 51367,
+ "<|20.08|>": 51368,
+ "<|20.10|>": 51369,
+ "<|20.12|>": 51370,
+ "<|20.14|>": 51371,
+ "<|20.16|>": 51372,
+ "<|20.18|>": 51373,
+ "<|20.20|>": 51374,
+ "<|20.22|>": 51375,
+ "<|20.24|>": 51376,
+ "<|20.26|>": 51377,
+ "<|20.28|>": 51378,
+ "<|20.30|>": 51379,
+ "<|20.32|>": 51380,
+ "<|20.34|>": 51381,
+ "<|20.36|>": 51382,
+ "<|20.38|>": 51383,
+ "<|20.40|>": 51384,
+ "<|20.42|>": 51385,
+ "<|20.44|>": 51386,
+ "<|20.46|>": 51387,
+ "<|20.48|>": 51388,
+ "<|20.50|>": 51389,
+ "<|20.52|>": 51390,
+ "<|20.54|>": 51391,
+ "<|20.56|>": 51392,
+ "<|20.58|>": 51393,
+ "<|20.60|>": 51394,
+ "<|20.62|>": 51395,
+ "<|20.64|>": 51396,
+ "<|20.66|>": 51397,
+ "<|20.68|>": 51398,
+ "<|20.70|>": 51399,
+ "<|20.72|>": 51400,
+ "<|20.74|>": 51401,
+ "<|20.76|>": 51402,
+ "<|20.78|>": 51403,
+ "<|20.80|>": 51404,
+ "<|20.82|>": 51405,
+ "<|20.84|>": 51406,
+ "<|20.86|>": 51407,
+ "<|20.88|>": 51408,
+ "<|20.90|>": 51409,
+ "<|20.92|>": 51410,
+ "<|20.94|>": 51411,
+ "<|20.96|>": 51412,
+ "<|20.98|>": 51413,
+ "<|21.00|>": 51414,
+ "<|21.02|>": 51415,
+ "<|21.04|>": 51416,
+ "<|21.06|>": 51417,
+ "<|21.08|>": 51418,
+ "<|21.10|>": 51419,
+ "<|21.12|>": 51420,
+ "<|21.14|>": 51421,
+ "<|21.16|>": 51422,
+ "<|21.18|>": 51423,
+ "<|21.20|>": 51424,
+ "<|21.22|>": 51425,
+ "<|21.24|>": 51426,
+ "<|21.26|>": 51427,
+ "<|21.28|>": 51428,
+ "<|21.30|>": 51429,
+ "<|21.32|>": 51430,
+ "<|21.34|>": 51431,
+ "<|21.36|>": 51432,
+ "<|21.38|>": 51433,
+ "<|21.40|>": 51434,
+ "<|21.42|>": 51435,
+ "<|21.44|>": 51436,
+ "<|21.46|>": 51437,
+ "<|21.48|>": 51438,
+ "<|21.50|>": 51439,
+ "<|21.52|>": 51440,
+ "<|21.54|>": 51441,
+ "<|21.56|>": 51442,
+ "<|21.58|>": 51443,
+ "<|21.60|>": 51444,
+ "<|21.62|>": 51445,
+ "<|21.64|>": 51446,
+ "<|21.66|>": 51447,
+ "<|21.68|>": 51448,
+ "<|21.70|>": 51449,
+ "<|21.72|>": 51450,
+ "<|21.74|>": 51451,
+ "<|21.76|>": 51452,
+ "<|21.78|>": 51453,
+ "<|21.80|>": 51454,
+ "<|21.82|>": 51455,
+ "<|21.84|>": 51456,
+ "<|21.86|>": 51457,
+ "<|21.88|>": 51458,
+ "<|21.90|>": 51459,
+ "<|21.92|>": 51460,
+ "<|21.94|>": 51461,
+ "<|21.96|>": 51462,
+ "<|21.98|>": 51463,
+ "<|22.00|>": 51464,
+ "<|22.02|>": 51465,
+ "<|22.04|>": 51466,
+ "<|22.06|>": 51467,
+ "<|22.08|>": 51468,
+ "<|22.10|>": 51469,
+ "<|22.12|>": 51470,
+ "<|22.14|>": 51471,
+ "<|22.16|>": 51472,
+ "<|22.18|>": 51473,
+ "<|22.20|>": 51474,
+ "<|22.22|>": 51475,
+ "<|22.24|>": 51476,
+ "<|22.26|>": 51477,
+ "<|22.28|>": 51478,
+ "<|22.30|>": 51479,
+ "<|22.32|>": 51480,
+ "<|22.34|>": 51481,
+ "<|22.36|>": 51482,
+ "<|22.38|>": 51483,
+ "<|22.40|>": 51484,
+ "<|22.42|>": 51485,
+ "<|22.44|>": 51486,
+ "<|22.46|>": 51487,
+ "<|22.48|>": 51488,
+ "<|22.50|>": 51489,
+ "<|22.52|>": 51490,
+ "<|22.54|>": 51491,
+ "<|22.56|>": 51492,
+ "<|22.58|>": 51493,
+ "<|22.60|>": 51494,
+ "<|22.62|>": 51495,
+ "<|22.64|>": 51496,
+ "<|22.66|>": 51497,
+ "<|22.68|>": 51498,
+ "<|22.70|>": 51499,
+ "<|22.72|>": 51500,
+ "<|22.74|>": 51501,
+ "<|22.76|>": 51502,
+ "<|22.78|>": 51503,
+ "<|22.80|>": 51504,
+ "<|22.82|>": 51505,
+ "<|22.84|>": 51506,
+ "<|22.86|>": 51507,
+ "<|22.88|>": 51508,
+ "<|22.90|>": 51509,
+ "<|22.92|>": 51510,
+ "<|22.94|>": 51511,
+ "<|22.96|>": 51512,
+ "<|22.98|>": 51513,
+ "<|23.00|>": 51514,
+ "<|23.02|>": 51515,
+ "<|23.04|>": 51516,
+ "<|23.06|>": 51517,
+ "<|23.08|>": 51518,
+ "<|23.10|>": 51519,
+ "<|23.12|>": 51520,
+ "<|23.14|>": 51521,
+ "<|23.16|>": 51522,
+ "<|23.18|>": 51523,
+ "<|23.20|>": 51524,
+ "<|23.22|>": 51525,
+ "<|23.24|>": 51526,
+ "<|23.26|>": 51527,
+ "<|23.28|>": 51528,
+ "<|23.30|>": 51529,
+ "<|23.32|>": 51530,
+ "<|23.34|>": 51531,
+ "<|23.36|>": 51532,
+ "<|23.38|>": 51533,
+ "<|23.40|>": 51534,
+ "<|23.42|>": 51535,
+ "<|23.44|>": 51536,
+ "<|23.46|>": 51537,
+ "<|23.48|>": 51538,
+ "<|23.50|>": 51539,
+ "<|23.52|>": 51540,
+ "<|23.54|>": 51541,
+ "<|23.56|>": 51542,
+ "<|23.58|>": 51543,
+ "<|23.60|>": 51544,
+ "<|23.62|>": 51545,
+ "<|23.64|>": 51546,
+ "<|23.66|>": 51547,
+ "<|23.68|>": 51548,
+ "<|23.70|>": 51549,
+ "<|23.72|>": 51550,
+ "<|23.74|>": 51551,
+ "<|23.76|>": 51552,
+ "<|23.78|>": 51553,
+ "<|23.80|>": 51554,
+ "<|23.82|>": 51555,
+ "<|23.84|>": 51556,
+ "<|23.86|>": 51557,
+ "<|23.88|>": 51558,
+ "<|23.90|>": 51559,
+ "<|23.92|>": 51560,
+ "<|23.94|>": 51561,
+ "<|23.96|>": 51562,
+ "<|23.98|>": 51563,
+ "<|24.00|>": 51564,
+ "<|24.02|>": 51565,
+ "<|24.04|>": 51566,
+ "<|24.06|>": 51567,
+ "<|24.08|>": 51568,
+ "<|24.10|>": 51569,
+ "<|24.12|>": 51570,
+ "<|24.14|>": 51571,
+ "<|24.16|>": 51572,
+ "<|24.18|>": 51573,
+ "<|24.20|>": 51574,
+ "<|24.22|>": 51575,
+ "<|24.24|>": 51576,
+ "<|24.26|>": 51577,
+ "<|24.28|>": 51578,
+ "<|24.30|>": 51579,
+ "<|24.32|>": 51580,
+ "<|24.34|>": 51581,
+ "<|24.36|>": 51582,
+ "<|24.38|>": 51583,
+ "<|24.40|>": 51584,
+ "<|24.42|>": 51585,
+ "<|24.44|>": 51586,
+ "<|24.46|>": 51587,
+ "<|24.48|>": 51588,
+ "<|24.50|>": 51589,
+ "<|24.52|>": 51590,
+ "<|24.54|>": 51591,
+ "<|24.56|>": 51592,
+ "<|24.58|>": 51593,
+ "<|24.60|>": 51594,
+ "<|24.62|>": 51595,
+ "<|24.64|>": 51596,
+ "<|24.66|>": 51597,
+ "<|24.68|>": 51598,
+ "<|24.70|>": 51599,
+ "<|24.72|>": 51600,
+ "<|24.74|>": 51601,
+ "<|24.76|>": 51602,
+ "<|24.78|>": 51603,
+ "<|24.80|>": 51604,
+ "<|24.82|>": 51605,
+ "<|24.84|>": 51606,
+ "<|24.86|>": 51607,
+ "<|24.88|>": 51608,
+ "<|24.90|>": 51609,
+ "<|24.92|>": 51610,
+ "<|24.94|>": 51611,
+ "<|24.96|>": 51612,
+ "<|24.98|>": 51613,
+ "<|25.00|>": 51614,
+ "<|25.02|>": 51615,
+ "<|25.04|>": 51616,
+ "<|25.06|>": 51617,
+ "<|25.08|>": 51618,
+ "<|25.10|>": 51619,
+ "<|25.12|>": 51620,
+ "<|25.14|>": 51621,
+ "<|25.16|>": 51622,
+ "<|25.18|>": 51623,
+ "<|25.20|>": 51624,
+ "<|25.22|>": 51625,
+ "<|25.24|>": 51626,
+ "<|25.26|>": 51627,
+ "<|25.28|>": 51628,
+ "<|25.30|>": 51629,
+ "<|25.32|>": 51630,
+ "<|25.34|>": 51631,
+ "<|25.36|>": 51632,
+ "<|25.38|>": 51633,
+ "<|25.40|>": 51634,
+ "<|25.42|>": 51635,
+ "<|25.44|>": 51636,
+ "<|25.46|>": 51637,
+ "<|25.48|>": 51638,
+ "<|25.50|>": 51639,
+ "<|25.52|>": 51640,
+ "<|25.54|>": 51641,
+ "<|25.56|>": 51642,
+ "<|25.58|>": 51643,
+ "<|25.60|>": 51644,
+ "<|25.62|>": 51645,
+ "<|25.64|>": 51646,
+ "<|25.66|>": 51647,
+ "<|25.68|>": 51648,
+ "<|25.70|>": 51649,
+ "<|25.72|>": 51650,
+ "<|25.74|>": 51651,
+ "<|25.76|>": 51652,
+ "<|25.78|>": 51653,
+ "<|25.80|>": 51654,
+ "<|25.82|>": 51655,
+ "<|25.84|>": 51656,
+ "<|25.86|>": 51657,
+ "<|25.88|>": 51658,
+ "<|25.90|>": 51659,
+ "<|25.92|>": 51660,
+ "<|25.94|>": 51661,
+ "<|25.96|>": 51662,
+ "<|25.98|>": 51663,
+ "<|26.00|>": 51664,
+ "<|26.02|>": 51665,
+ "<|26.04|>": 51666,
+ "<|26.06|>": 51667,
+ "<|26.08|>": 51668,
+ "<|26.10|>": 51669,
+ "<|26.12|>": 51670,
+ "<|26.14|>": 51671,
+ "<|26.16|>": 51672,
+ "<|26.18|>": 51673,
+ "<|26.20|>": 51674,
+ "<|26.22|>": 51675,
+ "<|26.24|>": 51676,
+ "<|26.26|>": 51677,
+ "<|26.28|>": 51678,
+ "<|26.30|>": 51679,
+ "<|26.32|>": 51680,
+ "<|26.34|>": 51681,
+ "<|26.36|>": 51682,
+ "<|26.38|>": 51683,
+ "<|26.40|>": 51684,
+ "<|26.42|>": 51685,
+ "<|26.44|>": 51686,
+ "<|26.46|>": 51687,
+ "<|26.48|>": 51688,
+ "<|26.50|>": 51689,
+ "<|26.52|>": 51690,
+ "<|26.54|>": 51691,
+ "<|26.56|>": 51692,
+ "<|26.58|>": 51693,
+ "<|26.60|>": 51694,
+ "<|26.62|>": 51695,
+ "<|26.64|>": 51696,
+ "<|26.66|>": 51697,
+ "<|26.68|>": 51698,
+ "<|26.70|>": 51699,
+ "<|26.72|>": 51700,
+ "<|26.74|>": 51701,
+ "<|26.76|>": 51702,
+ "<|26.78|>": 51703,
+ "<|26.80|>": 51704,
+ "<|26.82|>": 51705,
+ "<|26.84|>": 51706,
+ "<|26.86|>": 51707,
+ "<|26.88|>": 51708,
+ "<|26.90|>": 51709,
+ "<|26.92|>": 51710,
+ "<|26.94|>": 51711,
+ "<|26.96|>": 51712,
+ "<|26.98|>": 51713,
+ "<|27.00|>": 51714,
+ "<|27.02|>": 51715,
+ "<|27.04|>": 51716,
+ "<|27.06|>": 51717,
+ "<|27.08|>": 51718,
+ "<|27.10|>": 51719,
+ "<|27.12|>": 51720,
+ "<|27.14|>": 51721,
+ "<|27.16|>": 51722,
+ "<|27.18|>": 51723,
+ "<|27.20|>": 51724,
+ "<|27.22|>": 51725,
+ "<|27.24|>": 51726,
+ "<|27.26|>": 51727,
+ "<|27.28|>": 51728,
+ "<|27.30|>": 51729,
+ "<|27.32|>": 51730,
+ "<|27.34|>": 51731,
+ "<|27.36|>": 51732,
+ "<|27.38|>": 51733,
+ "<|27.40|>": 51734,
+ "<|27.42|>": 51735,
+ "<|27.44|>": 51736,
+ "<|27.46|>": 51737,
+ "<|27.48|>": 51738,
+ "<|27.50|>": 51739,
+ "<|27.52|>": 51740,
+ "<|27.54|>": 51741,
+ "<|27.56|>": 51742,
+ "<|27.58|>": 51743,
+ "<|27.60|>": 51744,
+ "<|27.62|>": 51745,
+ "<|27.64|>": 51746,
+ "<|27.66|>": 51747,
+ "<|27.68|>": 51748,
+ "<|27.70|>": 51749,
+ "<|27.72|>": 51750,
+ "<|27.74|>": 51751,
+ "<|27.76|>": 51752,
+ "<|27.78|>": 51753,
+ "<|27.80|>": 51754,
+ "<|27.82|>": 51755,
+ "<|27.84|>": 51756,
+ "<|27.86|>": 51757,
+ "<|27.88|>": 51758,
+ "<|27.90|>": 51759,
+ "<|27.92|>": 51760,
+ "<|27.94|>": 51761,
+ "<|27.96|>": 51762,
+ "<|27.98|>": 51763,
+ "<|28.00|>": 51764,
+ "<|28.02|>": 51765,
+ "<|28.04|>": 51766,
+ "<|28.06|>": 51767,
+ "<|28.08|>": 51768,
+ "<|28.10|>": 51769,
+ "<|28.12|>": 51770,
+ "<|28.14|>": 51771,
+ "<|28.16|>": 51772,
+ "<|28.18|>": 51773,
+ "<|28.20|>": 51774,
+ "<|28.22|>": 51775,
+ "<|28.24|>": 51776,
+ "<|28.26|>": 51777,
+ "<|28.28|>": 51778,
+ "<|28.30|>": 51779,
+ "<|28.32|>": 51780,
+ "<|28.34|>": 51781,
+ "<|28.36|>": 51782,
+ "<|28.38|>": 51783,
+ "<|28.40|>": 51784,
+ "<|28.42|>": 51785,
+ "<|28.44|>": 51786,
+ "<|28.46|>": 51787,
+ "<|28.48|>": 51788,
+ "<|28.50|>": 51789,
+ "<|28.52|>": 51790,
+ "<|28.54|>": 51791,
+ "<|28.56|>": 51792,
+ "<|28.58|>": 51793,
+ "<|28.60|>": 51794,
+ "<|28.62|>": 51795,
+ "<|28.64|>": 51796,
+ "<|28.66|>": 51797,
+ "<|28.68|>": 51798,
+ "<|28.70|>": 51799,
+ "<|28.72|>": 51800,
+ "<|28.74|>": 51801,
+ "<|28.76|>": 51802,
+ "<|28.78|>": 51803,
+ "<|28.80|>": 51804,
+ "<|28.82|>": 51805,
+ "<|28.84|>": 51806,
+ "<|28.86|>": 51807,
+ "<|28.88|>": 51808,
1097
+ "<|28.90|>": 51809,
1098
+ "<|28.92|>": 51810,
1099
+ "<|28.94|>": 51811,
1100
+ "<|28.96|>": 51812,
1101
+ "<|28.98|>": 51813,
1102
+ "<|29.00|>": 51814,
1103
+ "<|29.02|>": 51815,
1104
+ "<|29.04|>": 51816,
1105
+ "<|29.06|>": 51817,
1106
+ "<|29.08|>": 51818,
1107
+ "<|29.10|>": 51819,
1108
+ "<|29.12|>": 51820,
1109
+ "<|29.14|>": 51821,
1110
+ "<|29.16|>": 51822,
1111
+ "<|29.18|>": 51823,
1112
+ "<|29.20|>": 51824,
1113
+ "<|29.22|>": 51825,
1114
+ "<|29.24|>": 51826,
1115
+ "<|29.26|>": 51827,
1116
+ "<|29.28|>": 51828,
1117
+ "<|29.30|>": 51829,
1118
+ "<|29.32|>": 51830,
1119
+ "<|29.34|>": 51831,
1120
+ "<|29.36|>": 51832,
1121
+ "<|29.38|>": 51833,
1122
+ "<|29.40|>": 51834,
1123
+ "<|29.42|>": 51835,
1124
+ "<|29.44|>": 51836,
1125
+ "<|29.46|>": 51837,
1126
+ "<|29.48|>": 51838,
1127
+ "<|29.50|>": 51839,
1128
+ "<|29.52|>": 51840,
1129
+ "<|29.54|>": 51841,
1130
+ "<|29.56|>": 51842,
1131
+ "<|29.58|>": 51843,
1132
+ "<|29.60|>": 51844,
1133
+ "<|29.62|>": 51845,
1134
+ "<|29.64|>": 51846,
1135
+ "<|29.66|>": 51847,
1136
+ "<|29.68|>": 51848,
1137
+ "<|29.70|>": 51849,
1138
+ "<|29.72|>": 51850,
1139
+ "<|29.74|>": 51851,
1140
+ "<|29.76|>": 51852,
1141
+ "<|29.78|>": 51853,
1142
+ "<|29.80|>": 51854,
1143
+ "<|29.82|>": 51855,
1144
+ "<|29.84|>": 51856,
1145
+ "<|29.86|>": 51857,
1146
+ "<|29.88|>": 51858,
1147
+ "<|29.90|>": 51859,
1148
+ "<|29.92|>": 51860,
1149
+ "<|29.94|>": 51861,
1150
+ "<|29.96|>": 51862,
1151
+ "<|29.98|>": 51863,
1152
+ "<|3.00|>": 50514,
1153
+ "<|3.02|>": 50515,
1154
+ "<|3.04|>": 50516,
1155
+ "<|3.06|>": 50517,
1156
+ "<|3.08|>": 50518,
1157
+ "<|3.10|>": 50519,
1158
+ "<|3.12|>": 50520,
1159
+ "<|3.14|>": 50521,
1160
+ "<|3.16|>": 50522,
1161
+ "<|3.18|>": 50523,
1162
+ "<|3.20|>": 50524,
1163
+ "<|3.22|>": 50525,
1164
+ "<|3.24|>": 50526,
1165
+ "<|3.26|>": 50527,
1166
+ "<|3.28|>": 50528,
1167
+ "<|3.30|>": 50529,
1168
+ "<|3.32|>": 50530,
1169
+ "<|3.34|>": 50531,
1170
+ "<|3.36|>": 50532,
1171
+ "<|3.38|>": 50533,
1172
+ "<|3.40|>": 50534,
1173
+ "<|3.42|>": 50535,
1174
+ "<|3.44|>": 50536,
1175
+ "<|3.46|>": 50537,
1176
+ "<|3.48|>": 50538,
1177
+ "<|3.50|>": 50539,
1178
+ "<|3.52|>": 50540,
1179
+ "<|3.54|>": 50541,
1180
+ "<|3.56|>": 50542,
1181
+ "<|3.58|>": 50543,
1182
+ "<|3.60|>": 50544,
1183
+ "<|3.62|>": 50545,
1184
+ "<|3.64|>": 50546,
1185
+ "<|3.66|>": 50547,
1186
+ "<|3.68|>": 50548,
1187
+ "<|3.70|>": 50549,
1188
+ "<|3.72|>": 50550,
1189
+ "<|3.74|>": 50551,
1190
+ "<|3.76|>": 50552,
1191
+ "<|3.78|>": 50553,
1192
+ "<|3.80|>": 50554,
1193
+ "<|3.82|>": 50555,
1194
+ "<|3.84|>": 50556,
1195
+ "<|3.86|>": 50557,
1196
+ "<|3.88|>": 50558,
1197
+ "<|3.90|>": 50559,
1198
+ "<|3.92|>": 50560,
1199
+ "<|3.94|>": 50561,
1200
+ "<|3.96|>": 50562,
1201
+ "<|3.98|>": 50563,
1202
+ "<|30.00|>": 51864,
1203
+ "<|4.00|>": 50564,
1204
+ "<|4.02|>": 50565,
1205
+ "<|4.04|>": 50566,
1206
+ "<|4.06|>": 50567,
1207
+ "<|4.08|>": 50568,
1208
+ "<|4.10|>": 50569,
1209
+ "<|4.12|>": 50570,
1210
+ "<|4.14|>": 50571,
1211
+ "<|4.16|>": 50572,
1212
+ "<|4.18|>": 50573,
1213
+ "<|4.20|>": 50574,
1214
+ "<|4.22|>": 50575,
1215
+ "<|4.24|>": 50576,
1216
+ "<|4.26|>": 50577,
1217
+ "<|4.28|>": 50578,
1218
+ "<|4.30|>": 50579,
1219
+ "<|4.32|>": 50580,
1220
+ "<|4.34|>": 50581,
1221
+ "<|4.36|>": 50582,
1222
+ "<|4.38|>": 50583,
1223
+ "<|4.40|>": 50584,
1224
+ "<|4.42|>": 50585,
1225
+ "<|4.44|>": 50586,
1226
+ "<|4.46|>": 50587,
1227
+ "<|4.48|>": 50588,
1228
+ "<|4.50|>": 50589,
1229
+ "<|4.52|>": 50590,
1230
+ "<|4.54|>": 50591,
1231
+ "<|4.56|>": 50592,
1232
+ "<|4.58|>": 50593,
1233
+ "<|4.60|>": 50594,
1234
+ "<|4.62|>": 50595,
1235
+ "<|4.64|>": 50596,
1236
+ "<|4.66|>": 50597,
1237
+ "<|4.68|>": 50598,
1238
+ "<|4.70|>": 50599,
1239
+ "<|4.72|>": 50600,
1240
+ "<|4.74|>": 50601,
1241
+ "<|4.76|>": 50602,
1242
+ "<|4.78|>": 50603,
1243
+ "<|4.80|>": 50604,
1244
+ "<|4.82|>": 50605,
1245
+ "<|4.84|>": 50606,
1246
+ "<|4.86|>": 50607,
1247
+ "<|4.88|>": 50608,
1248
+ "<|4.90|>": 50609,
1249
+ "<|4.92|>": 50610,
1250
+ "<|4.94|>": 50611,
1251
+ "<|4.96|>": 50612,
1252
+ "<|4.98|>": 50613,
1253
+ "<|5.00|>": 50614,
1254
+ "<|5.02|>": 50615,
1255
+ "<|5.04|>": 50616,
1256
+ "<|5.06|>": 50617,
1257
+ "<|5.08|>": 50618,
1258
+ "<|5.10|>": 50619,
1259
+ "<|5.12|>": 50620,
1260
+ "<|5.14|>": 50621,
1261
+ "<|5.16|>": 50622,
1262
+ "<|5.18|>": 50623,
1263
+ "<|5.20|>": 50624,
1264
+ "<|5.22|>": 50625,
1265
+ "<|5.24|>": 50626,
1266
+ "<|5.26|>": 50627,
1267
+ "<|5.28|>": 50628,
1268
+ "<|5.30|>": 50629,
1269
+ "<|5.32|>": 50630,
1270
+ "<|5.34|>": 50631,
1271
+ "<|5.36|>": 50632,
1272
+ "<|5.38|>": 50633,
1273
+ "<|5.40|>": 50634,
1274
+ "<|5.42|>": 50635,
1275
+ "<|5.44|>": 50636,
1276
+ "<|5.46|>": 50637,
1277
+ "<|5.48|>": 50638,
1278
+ "<|5.50|>": 50639,
1279
+ "<|5.52|>": 50640,
1280
+ "<|5.54|>": 50641,
1281
+ "<|5.56|>": 50642,
1282
+ "<|5.58|>": 50643,
1283
+ "<|5.60|>": 50644,
1284
+ "<|5.62|>": 50645,
1285
+ "<|5.64|>": 50646,
1286
+ "<|5.66|>": 50647,
1287
+ "<|5.68|>": 50648,
1288
+ "<|5.70|>": 50649,
1289
+ "<|5.72|>": 50650,
1290
+ "<|5.74|>": 50651,
1291
+ "<|5.76|>": 50652,
1292
+ "<|5.78|>": 50653,
1293
+ "<|5.80|>": 50654,
1294
+ "<|5.82|>": 50655,
1295
+ "<|5.84|>": 50656,
1296
+ "<|5.86|>": 50657,
1297
+ "<|5.88|>": 50658,
1298
+ "<|5.90|>": 50659,
1299
+ "<|5.92|>": 50660,
1300
+ "<|5.94|>": 50661,
1301
+ "<|5.96|>": 50662,
1302
+ "<|5.98|>": 50663,
1303
+ "<|6.00|>": 50664,
1304
+ "<|6.02|>": 50665,
1305
+ "<|6.04|>": 50666,
1306
+ "<|6.06|>": 50667,
1307
+ "<|6.08|>": 50668,
1308
+ "<|6.10|>": 50669,
1309
+ "<|6.12|>": 50670,
1310
+ "<|6.14|>": 50671,
1311
+ "<|6.16|>": 50672,
1312
+ "<|6.18|>": 50673,
1313
+ "<|6.20|>": 50674,
1314
+ "<|6.22|>": 50675,
1315
+ "<|6.24|>": 50676,
1316
+ "<|6.26|>": 50677,
1317
+ "<|6.28|>": 50678,
1318
+ "<|6.30|>": 50679,
1319
+ "<|6.32|>": 50680,
1320
+ "<|6.34|>": 50681,
1321
+ "<|6.36|>": 50682,
1322
+ "<|6.38|>": 50683,
1323
+ "<|6.40|>": 50684,
1324
+ "<|6.42|>": 50685,
1325
+ "<|6.44|>": 50686,
1326
+ "<|6.46|>": 50687,
1327
+ "<|6.48|>": 50688,
1328
+ "<|6.50|>": 50689,
1329
+ "<|6.52|>": 50690,
1330
+ "<|6.54|>": 50691,
1331
+ "<|6.56|>": 50692,
1332
+ "<|6.58|>": 50693,
1333
+ "<|6.60|>": 50694,
1334
+ "<|6.62|>": 50695,
1335
+ "<|6.64|>": 50696,
1336
+ "<|6.66|>": 50697,
1337
+ "<|6.68|>": 50698,
1338
+ "<|6.70|>": 50699,
1339
+ "<|6.72|>": 50700,
1340
+ "<|6.74|>": 50701,
1341
+ "<|6.76|>": 50702,
1342
+ "<|6.78|>": 50703,
1343
+ "<|6.80|>": 50704,
1344
+ "<|6.82|>": 50705,
1345
+ "<|6.84|>": 50706,
1346
+ "<|6.86|>": 50707,
1347
+ "<|6.88|>": 50708,
1348
+ "<|6.90|>": 50709,
1349
+ "<|6.92|>": 50710,
1350
+ "<|6.94|>": 50711,
1351
+ "<|6.96|>": 50712,
1352
+ "<|6.98|>": 50713,
1353
+ "<|7.00|>": 50714,
1354
+ "<|7.02|>": 50715,
1355
+ "<|7.04|>": 50716,
1356
+ "<|7.06|>": 50717,
1357
+ "<|7.08|>": 50718,
1358
+ "<|7.10|>": 50719,
1359
+ "<|7.12|>": 50720,
1360
+ "<|7.14|>": 50721,
1361
+ "<|7.16|>": 50722,
1362
+ "<|7.18|>": 50723,
1363
+ "<|7.20|>": 50724,
1364
+ "<|7.22|>": 50725,
1365
+ "<|7.24|>": 50726,
1366
+ "<|7.26|>": 50727,
1367
+ "<|7.28|>": 50728,
1368
+ "<|7.30|>": 50729,
1369
+ "<|7.32|>": 50730,
1370
+ "<|7.34|>": 50731,
1371
+ "<|7.36|>": 50732,
1372
+ "<|7.38|>": 50733,
1373
+ "<|7.40|>": 50734,
1374
+ "<|7.42|>": 50735,
1375
+ "<|7.44|>": 50736,
1376
+ "<|7.46|>": 50737,
1377
+ "<|7.48|>": 50738,
1378
+ "<|7.50|>": 50739,
1379
+ "<|7.52|>": 50740,
1380
+ "<|7.54|>": 50741,
1381
+ "<|7.56|>": 50742,
1382
+ "<|7.58|>": 50743,
1383
+ "<|7.60|>": 50744,
1384
+ "<|7.62|>": 50745,
1385
+ "<|7.64|>": 50746,
1386
+ "<|7.66|>": 50747,
1387
+ "<|7.68|>": 50748,
1388
+ "<|7.70|>": 50749,
1389
+ "<|7.72|>": 50750,
1390
+ "<|7.74|>": 50751,
1391
+ "<|7.76|>": 50752,
1392
+ "<|7.78|>": 50753,
1393
+ "<|7.80|>": 50754,
1394
+ "<|7.82|>": 50755,
1395
+ "<|7.84|>": 50756,
1396
+ "<|7.86|>": 50757,
1397
+ "<|7.88|>": 50758,
1398
+ "<|7.90|>": 50759,
1399
+ "<|7.92|>": 50760,
1400
+ "<|7.94|>": 50761,
1401
+ "<|7.96|>": 50762,
1402
+ "<|7.98|>": 50763,
1403
+ "<|8.00|>": 50764,
1404
+ "<|8.02|>": 50765,
1405
+ "<|8.04|>": 50766,
1406
+ "<|8.06|>": 50767,
1407
+ "<|8.08|>": 50768,
1408
+ "<|8.10|>": 50769,
1409
+ "<|8.12|>": 50770,
1410
+ "<|8.14|>": 50771,
1411
+ "<|8.16|>": 50772,
1412
+ "<|8.18|>": 50773,
1413
+ "<|8.20|>": 50774,
1414
+ "<|8.22|>": 50775,
1415
+ "<|8.24|>": 50776,
1416
+ "<|8.26|>": 50777,
1417
+ "<|8.28|>": 50778,
1418
+ "<|8.30|>": 50779,
1419
+ "<|8.32|>": 50780,
1420
+ "<|8.34|>": 50781,
1421
+ "<|8.36|>": 50782,
1422
+ "<|8.38|>": 50783,
1423
+ "<|8.40|>": 50784,
1424
+ "<|8.42|>": 50785,
1425
+ "<|8.44|>": 50786,
1426
+ "<|8.46|>": 50787,
1427
+ "<|8.48|>": 50788,
1428
+ "<|8.50|>": 50789,
1429
+ "<|8.52|>": 50790,
1430
+ "<|8.54|>": 50791,
1431
+ "<|8.56|>": 50792,
1432
+ "<|8.58|>": 50793,
1433
+ "<|8.60|>": 50794,
1434
+ "<|8.62|>": 50795,
1435
+ "<|8.64|>": 50796,
1436
+ "<|8.66|>": 50797,
1437
+ "<|8.68|>": 50798,
1438
+ "<|8.70|>": 50799,
1439
+ "<|8.72|>": 50800,
1440
+ "<|8.74|>": 50801,
1441
+ "<|8.76|>": 50802,
1442
+ "<|8.78|>": 50803,
1443
+ "<|8.80|>": 50804,
1444
+ "<|8.82|>": 50805,
1445
+ "<|8.84|>": 50806,
1446
+ "<|8.86|>": 50807,
1447
+ "<|8.88|>": 50808,
1448
+ "<|8.90|>": 50809,
1449
+ "<|8.92|>": 50810,
1450
+ "<|8.94|>": 50811,
1451
+ "<|8.96|>": 50812,
1452
+ "<|8.98|>": 50813,
1453
+ "<|9.00|>": 50814,
1454
+ "<|9.02|>": 50815,
1455
+ "<|9.04|>": 50816,
1456
+ "<|9.06|>": 50817,
1457
+ "<|9.08|>": 50818,
1458
+ "<|9.10|>": 50819,
1459
+ "<|9.12|>": 50820,
1460
+ "<|9.14|>": 50821,
1461
+ "<|9.16|>": 50822,
1462
+ "<|9.18|>": 50823,
1463
+ "<|9.20|>": 50824,
1464
+ "<|9.22|>": 50825,
1465
+ "<|9.24|>": 50826,
1466
+ "<|9.26|>": 50827,
1467
+ "<|9.28|>": 50828,
1468
+ "<|9.30|>": 50829,
1469
+ "<|9.32|>": 50830,
1470
+ "<|9.34|>": 50831,
1471
+ "<|9.36|>": 50832,
1472
+ "<|9.38|>": 50833,
1473
+ "<|9.40|>": 50834,
1474
+ "<|9.42|>": 50835,
1475
+ "<|9.44|>": 50836,
1476
+ "<|9.46|>": 50837,
1477
+ "<|9.48|>": 50838,
1478
+ "<|9.50|>": 50839,
1479
+ "<|9.52|>": 50840,
1480
+ "<|9.54|>": 50841,
1481
+ "<|9.56|>": 50842,
1482
+ "<|9.58|>": 50843,
1483
+ "<|9.60|>": 50844,
1484
+ "<|9.62|>": 50845,
1485
+ "<|9.64|>": 50846,
1486
+ "<|9.66|>": 50847,
1487
+ "<|9.68|>": 50848,
1488
+ "<|9.70|>": 50849,
1489
+ "<|9.72|>": 50850,
1490
+ "<|9.74|>": 50851,
1491
+ "<|9.76|>": 50852,
1492
+ "<|9.78|>": 50853,
1493
+ "<|9.80|>": 50854,
1494
+ "<|9.82|>": 50855,
1495
+ "<|9.84|>": 50856,
1496
+ "<|9.86|>": 50857,
1497
+ "<|9.88|>": 50858,
1498
+ "<|9.90|>": 50859,
1499
+ "<|9.92|>": 50860,
1500
+ "<|9.94|>": 50861,
1501
+ "<|9.96|>": 50862,
1502
+ "<|9.98|>": 50863,
1503
+ "<|af|>": 50327,
1504
+ "<|am|>": 50334,
1505
+ "<|ar|>": 50272,
1506
+ "<|as|>": 50350,
1507
+ "<|az|>": 50304,
1508
+ "<|ba|>": 50355,
1509
+ "<|be|>": 50330,
1510
+ "<|bg|>": 50292,
1511
+ "<|bn|>": 50302,
1512
+ "<|bo|>": 50347,
1513
+ "<|br|>": 50309,
1514
+ "<|bs|>": 50315,
1515
+ "<|ca|>": 50270,
1516
+ "<|cs|>": 50283,
1517
+ "<|cy|>": 50297,
1518
+ "<|da|>": 50285,
1519
+ "<|de|>": 50261,
1520
+ "<|el|>": 50281,
1521
+ "<|en|>": 50259,
1522
+ "<|es|>": 50262,
1523
+ "<|et|>": 50307,
1524
+ "<|eu|>": 50310,
1525
+ "<|fa|>": 50300,
1526
+ "<|fi|>": 50277,
1527
+ "<|fo|>": 50338,
1528
+ "<|fr|>": 50265,
1529
+ "<|gl|>": 50319,
1530
+ "<|gu|>": 50333,
1531
+ "<|haw|>": 50352,
1532
+ "<|ha|>": 50354,
1533
+ "<|he|>": 50279,
1534
+ "<|hi|>": 50276,
1535
+ "<|hr|>": 50291,
1536
+ "<|ht|>": 50339,
1537
+ "<|hu|>": 50286,
1538
+ "<|hy|>": 50312,
1539
+ "<|id|>": 50275,
1540
+ "<|is|>": 50311,
1541
+ "<|it|>": 50274,
1542
+ "<|ja|>": 50266,
1543
+ "<|jw|>": 50356,
1544
+ "<|ka|>": 50329,
1545
+ "<|kk|>": 50316,
1546
+ "<|km|>": 50323,
1547
+ "<|kn|>": 50306,
1548
+ "<|ko|>": 50264,
1549
+ "<|la|>": 50294,
1550
+ "<|lb|>": 50345,
1551
+ "<|ln|>": 50353,
1552
+ "<|lo|>": 50336,
1553
+ "<|lt|>": 50293,
1554
+ "<|lv|>": 50301,
1555
+ "<|mg|>": 50349,
1556
+ "<|mi|>": 50295,
1557
+ "<|mk|>": 50308,
1558
+ "<|ml|>": 50296,
1559
+ "<|mn|>": 50314,
1560
+ "<|mr|>": 50320,
1561
+ "<|ms|>": 50282,
1562
+ "<|mt|>": 50343,
1563
+ "<|my|>": 50346,
1564
+ "<|ne|>": 50313,
1565
+ "<|nl|>": 50271,
1566
+ "<|nn|>": 50342,
1567
+ "<|nocaptions|>": 50362,
1568
+ "<|notimestamps|>": 50363,
1569
+ "<|no|>": 50288,
1570
+ "<|oc|>": 50328,
1571
+ "<|pa|>": 50321,
1572
+ "<|pl|>": 50269,
1573
+ "<|ps|>": 50340,
1574
+ "<|pt|>": 50267,
1575
+ "<|ro|>": 50284,
1576
+ "<|ru|>": 50263,
1577
+ "<|sa|>": 50344,
1578
+ "<|sd|>": 50332,
1579
+ "<|si|>": 50322,
1580
+ "<|sk|>": 50298,
1581
+ "<|sl|>": 50305,
1582
+ "<|sn|>": 50324,
1583
+ "<|so|>": 50326,
1584
+ "<|sq|>": 50317,
1585
+ "<|sr|>": 50303,
1586
+ "<|startoflm|>": 50360,
1587
+ "<|startofprev|>": 50361,
1588
+ "<|startoftranscript|>": 50258,
1589
+ "<|su|>": 50357,
1590
+ "<|sv|>": 50273,
1591
+ "<|sw|>": 50318,
1592
+ "<|ta|>": 50287,
1593
+ "<|te|>": 50299,
1594
+ "<|tg|>": 50331,
1595
+ "<|th|>": 50289,
1596
+ "<|tk|>": 50341,
1597
+ "<|tl|>": 50348,
1598
+ "<|transcribe|>": 50359,
1599
+ "<|translate|>": 50358,
1600
+ "<|tr|>": 50268,
1601
+ "<|tt|>": 50351,
1602
+ "<|uk|>": 50280,
1603
+ "<|ur|>": 50290,
1604
+ "<|uz|>": 50337,
1605
+ "<|vi|>": 50278,
1606
+ "<|yi|>": 50335,
1607
+ "<|yo|>": 50325,
1608
+ "<|zh|>": 50260
1609
+ }
checkpoint-1000/config.json ADDED
@@ -0,0 +1,60 @@
+ {
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": null,
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 768,
+ "decoder_attention_heads": 12,
+ "decoder_ffn_dim": 3072,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 12,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 12,
+ "encoder_ffn_dim": 3072,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 12,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [
+ [
+ 1,
+ 50259
+ ],
+ [
+ 2,
+ 50359
+ ],
+ [
+ 3,
+ 50363
+ ]
+ ],
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": null,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 12,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "torch_dtype": "float32",
+ "transformers_version": "4.51.3",
+ "use_cache": true,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
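As a quick orientation for readers of the config above, the architecture fields describe a Whisper-small-sized encoder-decoder (d_model 768, 12 encoder and 12 decoder layers, 12 attention heads). The following sketch is illustrative only and not part of the repository; the dictionary simply copies values from checkpoint-1000/config.json so they can be sanity-checked without loading the model.

```python
# Illustrative sketch (not part of this repository): sanity-checking the
# architecture values copied from checkpoint-1000/config.json above.
config = {
    "d_model": 768,
    "encoder_layers": 12,
    "decoder_layers": 12,
    "encoder_attention_heads": 12,
    "decoder_attention_heads": 12,
    "vocab_size": 51865,
    "max_source_positions": 1500,
    "max_target_positions": 448,
}

# Each attention head spans d_model / num_heads dimensions.
head_dim = config["d_model"] // config["encoder_attention_heads"]
print(head_dim)  # 64

# Total transformer layers across encoder and decoder.
print(config["encoder_layers"] + config["decoder_layers"])  # 24
```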
checkpoint-1000/generation_config.json ADDED
@@ -0,0 +1,254 @@
+ {
+ "alignment_heads": [
+ [
+ 5,
+ 3
+ ],
+ [
+ 5,
+ 9
+ ],
+ [
+ 8,
+ 0
+ ],
+ [
+ 8,
+ 4
+ ],
+ [
+ 8,
+ 7
+ ],
+ [
+ 8,
+ 8
+ ],
+ [
+ 9,
+ 0
+ ],
+ [
+ 9,
+ 7
+ ],
+ [
+ 9,
+ 9
+ ],
+ [
+ 10,
+ 5
+ ]
+ ],
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327,
+ "<|am|>": 50334,
+ "<|ar|>": 50272,
+ "<|as|>": 50350,
+ "<|az|>": 50304,
+ "<|ba|>": 50355,
+ "<|be|>": 50330,
+ "<|bg|>": 50292,
+ "<|bn|>": 50302,
+ "<|bo|>": 50347,
+ "<|br|>": 50309,
+ "<|bs|>": 50315,
+ "<|ca|>": 50270,
+ "<|cs|>": 50283,
+ "<|cy|>": 50297,
+ "<|da|>": 50285,
+ "<|de|>": 50261,
+ "<|el|>": 50281,
+ "<|en|>": 50259,
+ "<|es|>": 50262,
+ "<|et|>": 50307,
+ "<|eu|>": 50310,
+ "<|fa|>": 50300,
+ "<|fi|>": 50277,
+ "<|fo|>": 50338,
+ "<|fr|>": 50265,
+ "<|gl|>": 50319,
+ "<|gu|>": 50333,
+ "<|haw|>": 50352,
+ "<|ha|>": 50354,
+ "<|he|>": 50279,
+ "<|hi|>": 50276,
+ "<|hr|>": 50291,
+ "<|ht|>": 50339,
+ "<|hu|>": 50286,
+ "<|hy|>": 50312,
+ "<|id|>": 50275,
+ "<|is|>": 50311,
+ "<|it|>": 50274,
+ "<|ja|>": 50266,
+ "<|jw|>": 50356,
+ "<|ka|>": 50329,
+ "<|kk|>": 50316,
+ "<|km|>": 50323,
+ "<|kn|>": 50306,
+ "<|ko|>": 50264,
+ "<|la|>": 50294,
+ "<|lb|>": 50345,
+ "<|ln|>": 50353,
+ "<|lo|>": 50336,
+ "<|lt|>": 50293,
+ "<|lv|>": 50301,
+ "<|mg|>": 50349,
+ "<|mi|>": 50295,
+ "<|mk|>": 50308,
+ "<|ml|>": 50296,
+ "<|mn|>": 50314,
+ "<|mr|>": 50320,
+ "<|ms|>": 50282,
+ "<|mt|>": 50343,
+ "<|my|>": 50346,
+ "<|ne|>": 50313,
+ "<|nl|>": 50271,
+ "<|nn|>": 50342,
+ "<|no|>": 50288,
+ "<|oc|>": 50328,
+ "<|pa|>": 50321,
+ "<|pl|>": 50269,
+ "<|ps|>": 50340,
+ "<|pt|>": 50267,
+ "<|ro|>": 50284,
+ "<|ru|>": 50263,
+ "<|sa|>": 50344,
+ "<|sd|>": 50332,
+ "<|si|>": 50322,
+ "<|sk|>": 50298,
+ "<|sl|>": 50305,
+ "<|sn|>": 50324,
+ "<|so|>": 50326,
+ "<|sq|>": 50317,
+ "<|sr|>": 50303,
+ "<|su|>": 50357,
+ "<|sv|>": 50273,
+ "<|sw|>": 50318,
+ "<|ta|>": 50287,
+ "<|te|>": 50299,
+ "<|tg|>": 50331,
+ "<|th|>": 50289,
+ "<|tk|>": 50341,
+ "<|tl|>": 50348,
+ "<|tr|>": 50268,
+ "<|tt|>": 50351,
+ "<|uk|>": 50280,
+ "<|ur|>": 50290,
+ "<|uz|>": 50337,
+ "<|vi|>": 50278,
+ "<|yi|>": 50335,
+ "<|yo|>": 50325,
+ "<|zh|>": 50260
+ },
+ "language": "chinese",
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1,
+ 2,
+ 7,
+ 8,
+ 9,
+ 10,
+ 14,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 31,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 90,
+ 91,
+ 92,
+ 93,
+ 359,
+ 503,
+ 522,
+ 542,
+ 873,
+ 893,
+ 902,
+ 918,
+ 922,
+ 931,
+ 1350,
+ 1853,
+ 1982,
+ 2460,
+ 2627,
+ 3246,
+ 3253,
+ 3268,
+ 3536,
+ 3846,
+ 3961,
+ 4183,
+ 4667,
+ 6585,
+ 6647,
+ 7273,
+ 9061,
+ 9383,
+ 10428,
+ 10929,
+ 11938,
+ 12033,
+ 12331,
+ 12562,
+ 13793,
+ 14157,
+ 14635,
+ 15265,
+ 15618,
+ 16553,
+ 16604,
+ 18362,
+ 18956,
+ 20075,
+ 21675,
+ 22520,
+ 26130,
+ 26161,
+ 26435,
+ 28279,
+ 29464,
+ 31650,
+ 32302,
+ 32470,
+ 36865,
+ 42863,
+ 47425,
+ 49870,
+ 50254,
+ 50258,
+ 50360,
+ 50361,
+ 50362
+ ],
+ "task": "transcribe",
+ "task_to_id": {
+ "transcribe": 50359,
+ "translate": 50358
+ },
+ "transformers_version": "4.51.3"
+ }
checkpoint-1000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0a8a08fe40fefd33cb1cb64ae5d293bf10df962aa69cec6c9f46f4f4978d4f5b
+ size 966995080
checkpoint-1000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c3c5cedad0cb5c75271960ddc8738cc3eacb8806a3a30abb336e5e2b442a893f
+ size 1925064044
checkpoint-1000/preprocessor_config.json ADDED
@@ -0,0 +1,15 @@
+ {
+ "chunk_length": 30,
+ "dither": 0.0,
+ "feature_extractor_type": "WhisperFeatureExtractor",
+ "feature_size": 80,
+ "hop_length": 160,
+ "n_fft": 400,
+ "n_samples": 480000,
+ "nb_max_frames": 3000,
+ "padding_side": "right",
+ "padding_value": 0.0,
+ "processor_class": "WhisperProcessor",
+ "return_attention_mask": false,
+ "sampling_rate": 16000
+ }
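The feature-extractor values above are internally consistent: 30-second chunks at 16 kHz give 480,000 samples, and a hop of 160 samples yields 3,000 mel frames. The short check below is illustrative only (not part of the repository), with the constants copied from preprocessor_config.json.

```python
# Illustrative check of the preprocessor_config.json values above.
sampling_rate = 16000  # Hz, from "sampling_rate"
chunk_length = 30      # seconds, from "chunk_length"
hop_length = 160       # samples between successive STFT frames, from "hop_length"

n_samples = sampling_rate * chunk_length
nb_max_frames = n_samples // hop_length

print(n_samples)      # 480000, matching "n_samples"
print(nb_max_frames)  # 3000, matching "nb_max_frames"
```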
checkpoint-1000/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:77a953c1b48670da1bc7db6257f23ac91e47f20593c5406b86aa1a7c77f1bd47
+ size 14244
checkpoint-1000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a51738ca0af55e803ac3fdbc0e3b67846eda4ca13018dd2960c216586e47c984
+ size 1064
checkpoint-1000/trainer_state.json ADDED
@@ -0,0 +1,323 @@
+ {
+ "best_global_step": 1000,
+ "best_metric": 84.01915040370025,
+ "best_model_checkpoint": "./working_area/output_model/checkpoint-1000",
+ "epoch": 0.16173378618793466,
+ "eval_steps": 1000,
+ "global_step": 1000,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.004043344654698367,
+ "grad_norm": 10.67656421661377,
+ "learning_rate": 4.800000000000001e-07,
+ "loss": 0.2728,
+ "step": 25
+ },
+ {
+ "epoch": 0.008086689309396733,
+ "grad_norm": 0.34478363394737244,
+ "learning_rate": 9.800000000000001e-07,
+ "loss": 0.0358,
+ "step": 50
+ },
+ {
+ "epoch": 0.012130033964095099,
+ "grad_norm": 0.26953190565109253,
+ "learning_rate": 1.48e-06,
+ "loss": 0.0261,
+ "step": 75
+ },
+ {
+ "epoch": 0.016173378618793467,
+ "grad_norm": 0.2419331669807434,
+ "learning_rate": 1.98e-06,
+ "loss": 0.0251,
+ "step": 100
+ },
+ {
+ "epoch": 0.020216723273491832,
+ "grad_norm": 0.29993513226509094,
+ "learning_rate": 2.4800000000000004e-06,
+ "loss": 0.0247,
+ "step": 125
+ },
+ {
+ "epoch": 0.024260067928190198,
+ "grad_norm": 0.2362569123506546,
+ "learning_rate": 2.9800000000000003e-06,
+ "loss": 0.0227,
+ "step": 150
+ },
+ {
+ "epoch": 0.028303412582888564,
+ "grad_norm": 0.2520463466644287,
+ "learning_rate": 3.48e-06,
+ "loss": 0.0246,
+ "step": 175
+ },
+ {
+ "epoch": 0.03234675723758693,
+ "grad_norm": 0.2729567885398865,
+ "learning_rate": 3.980000000000001e-06,
+ "loss": 0.0236,
+ "step": 200
+ },
+ {
+ "epoch": 0.036390101892285295,
+ "grad_norm": 0.21665456891059875,
+ "learning_rate": 4.48e-06,
+ "loss": 0.0242,
+ "step": 225
+ },
+ {
+ "epoch": 0.040433446546983665,
+ "grad_norm": 0.27291786670684814,
+ "learning_rate": 4.980000000000001e-06,
+ "loss": 0.0242,
+ "step": 250
+ },
+ {
+ "epoch": 0.044476791201682034,
+ "grad_norm": 0.20698551833629608,
+ "learning_rate": 5.480000000000001e-06,
+ "loss": 0.0229,
+ "step": 275
+ },
+ {
+ "epoch": 0.048520135856380396,
+ "grad_norm": 0.2506471276283264,
+ "learning_rate": 5.98e-06,
+ "loss": 0.0225,
+ "step": 300
+ },
+ {
+ "epoch": 0.052563480511078765,
+ "grad_norm": 0.24540211260318756,
+ "learning_rate": 6.480000000000001e-06,
+ "loss": 0.0231,
+ "step": 325
+ },
+ {
+ "epoch": 0.05660682516577713,
+ "grad_norm": 0.2298947423696518,
+ "learning_rate": 6.98e-06,
+ "loss": 0.0223,
+ "step": 350
+ },
+ {
+ "epoch": 0.0606501698204755,
+ "grad_norm": 0.19595475494861603,
+ "learning_rate": 7.48e-06,
+ "loss": 0.022,
+ "step": 375
+ },
+ {
+ "epoch": 0.06469351447517387,
+ "grad_norm": 0.2605821490287781,
+ "learning_rate": 7.980000000000002e-06,
+ "loss": 0.022,
+ "step": 400
+ },
+ {
+ "epoch": 0.06873685912987224,
+ "grad_norm": 0.25346869230270386,
+ "learning_rate": 8.48e-06,
+ "loss": 0.0226,
+ "step": 425
+ },
+ {
+ "epoch": 0.07278020378457059,
+ "grad_norm": 0.255500465631485,
+ "learning_rate": 8.98e-06,
+ "loss": 0.0216,
+ "step": 450
+ },
+ {
+ "epoch": 0.07682354843926896,
+ "grad_norm": 0.24462725222110748,
+ "learning_rate": 9.48e-06,
+ "loss": 0.0234,
+ "step": 475
+ },
+ {
+ "epoch": 0.08086689309396733,
+ "grad_norm": 0.23454348742961884,
+ "learning_rate": 9.980000000000001e-06,
+ "loss": 0.024,
+ "step": 500
+ },
+ {
+ "epoch": 0.0849102377486657,
+ "grad_norm": 0.2627425193786621,
+ "learning_rate": 9.946666666666667e-06,
+ "loss": 0.0226,
+ "step": 525
+ },
+ {
+ "epoch": 0.08895358240336407,
+ "grad_norm": 0.22754116356372833,
+ "learning_rate": 9.891111111111113e-06,
+ "loss": 0.023,
+ "step": 550
+ },
+ {
+ "epoch": 0.09299692705806242,
+ "grad_norm": 0.26447415351867676,
+ "learning_rate": 9.835555555555556e-06,
+ "loss": 0.0215,
+ "step": 575
+ },
+ {
+ "epoch": 0.09704027171276079,
+ "grad_norm": 0.27743828296661377,
+ "learning_rate": 9.780000000000001e-06,
+ "loss": 0.0215,
+ "step": 600
+ },
+ {
+ "epoch": 0.10108361636745916,
+ "grad_norm": 0.23157595098018646,
+ "learning_rate": 9.724444444444445e-06,
+ "loss": 0.023,
+ "step": 625
+ },
+ {
+ "epoch": 0.10512696102215753,
+ "grad_norm": 0.2561393082141876,
+ "learning_rate": 9.66888888888889e-06,
+ "loss": 0.0209,
+ "step": 650
+ },
+ {
+ "epoch": 0.1091703056768559,
+ "grad_norm": 0.2518314719200134,
+ "learning_rate": 9.613333333333335e-06,
+ "loss": 0.0235,
+ "step": 675
+ },
+ {
+ "epoch": 0.11321365033155426,
+ "grad_norm": 0.2346208244562149,
+ "learning_rate": 9.557777777777777e-06,
+ "loss": 0.0222,
+ "step": 700
+ },
+ {
+ "epoch": 0.11725699498625262,
+ "grad_norm": 0.23254457116127014,
+ "learning_rate": 9.502222222222223e-06,
+ "loss": 0.0216,
+ "step": 725
+ },
+ {
+ "epoch": 0.121300339640951,
+ "grad_norm": 0.2728439271450043,
+ "learning_rate": 9.446666666666667e-06,
+ "loss": 0.0208,
+ "step": 750
+ },
+ {
+ "epoch": 0.12534368429564935,
+ "grad_norm": 0.24401770532131195,
+ "learning_rate": 9.391111111111111e-06,
+ "loss": 0.0222,
+ "step": 775
+ },
+ {
+ "epoch": 0.12938702895034773,
+ "grad_norm": 0.2546014189720154,
+ "learning_rate": 9.335555555555557e-06,
+ "loss": 0.0223,
+ "step": 800
+ },
+ {
+ "epoch": 0.1334303736050461,
+ "grad_norm": 0.20592975616455078,
+ "learning_rate": 9.280000000000001e-06,
+ "loss": 0.0214,
+ "step": 825
+ },
+ {
+ "epoch": 0.13747371825974447,
+ "grad_norm": 0.2854577898979187,
+ "learning_rate": 9.224444444444445e-06,
+ "loss": 0.021,
+ "step": 850
+ },
+ {
+ "epoch": 0.14151706291444283,
+ "grad_norm": 0.2555174231529236,
+ "learning_rate": 9.168888888888889e-06,
+ "loss": 0.0216,
+ "step": 875
+ },
+ {
+ "epoch": 0.14556040756914118,
+ "grad_norm": 0.22724099457263947,
+ "learning_rate": 9.113333333333335e-06,
+ "loss": 0.0226,
+ "step": 900
+ },
+ {
+ "epoch": 0.14960375222383956,
+ "grad_norm": 0.24628663063049316,
+ "learning_rate": 9.057777777777779e-06,
+ "loss": 0.021,
+ "step": 925
+ },
+ {
+ "epoch": 0.15364709687853792,
+ "grad_norm": 0.22158333659172058,
+ "learning_rate": 9.002222222222223e-06,
+ "loss": 0.0221,
+ "step": 950
+ },
+ {
+ "epoch": 0.1576904415332363,
+ "grad_norm": 0.23985975980758667,
+ "learning_rate": 8.946666666666669e-06,
+ "loss": 0.0227,
+ "step": 975
+ },
+ {
+ "epoch": 0.16173378618793466,
+ "grad_norm": 0.24263052642345428,
+ "learning_rate": 8.891111111111111e-06,
+ "loss": 0.021,
+ "step": 1000
+ },
+ {
+ "epoch": 0.16173378618793466,
+ "eval_loss": 0.026818539947271347,
+ "eval_runtime": 15594.6052,
+ "eval_samples_per_second": 7.9,
+ "eval_steps_per_second": 0.494,
+ "eval_wer": 84.01915040370025,
+ "step": 1000
+ }
+ ],
+ "logging_steps": 25,
+ "max_steps": 5000,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 1,
+ "save_steps": 1000,
+ "stateful_callbacks": {
+ "TrainerControl": {
+ "args": {
+ "should_epoch_stop": false,
+ "should_evaluate": false,
+ "should_log": false,
+ "should_save": true,
+ "should_training_stop": false
+ },
+ "attributes": {}
+ }
+ },
+ "total_flos": 9.23473281024e+18,
+ "train_batch_size": 32,
+ "trial_name": null,
+ "trial_params": null
+ }
checkpoint-1000/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f9f7caca91e9e74c1f1f32c6f743e7a01e94bbb4a987606d363be4bc023f04fb
+ size 5496
checkpoint-2000/config.json ADDED
@@ -0,0 +1,60 @@
+ {
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": null,
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 768,
+ "decoder_attention_heads": 12,
+ "decoder_ffn_dim": 3072,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 12,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 12,
+ "encoder_ffn_dim": 3072,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 12,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [
+ [
+ 1,
+ 50259
+ ],
+ [
+ 2,
+ 50359
+ ],
+ [
+ 3,
+ 50363
+ ]
+ ],
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": null,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 12,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "torch_dtype": "float32",
+ "transformers_version": "4.51.3",
+ "use_cache": true,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
checkpoint-2000/generation_config.json ADDED
@@ -0,0 +1,254 @@
+ {
+ "alignment_heads": [
+ [
+ 5,
+ 3
+ ],
+ [
+ 5,
+ 9
+ ],
+ [
+ 8,
+ 0
+ ],
+ [
+ 8,
+ 4
+ ],
+ [
+ 8,
+ 7
+ ],
+ [
+ 8,
+ 8
+ ],
+ [
+ 9,
+ 0
+ ],
+ [
+ 9,
+ 7
+ ],
+ [
+ 9,
+ 9
+ ],
+ [
+ 10,
+ 5
+ ]
+ ],
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327,
+ "<|am|>": 50334,
+ "<|ar|>": 50272,
+ "<|as|>": 50350,
+ "<|az|>": 50304,
+ "<|ba|>": 50355,
+ "<|be|>": 50330,
+ "<|bg|>": 50292,
+ "<|bn|>": 50302,
+ "<|bo|>": 50347,
+ "<|br|>": 50309,
+ "<|bs|>": 50315,
+ "<|ca|>": 50270,
+ "<|cs|>": 50283,
+ "<|cy|>": 50297,
+ "<|da|>": 50285,
+ "<|de|>": 50261,
+ "<|el|>": 50281,
+ "<|en|>": 50259,
+ "<|es|>": 50262,
+ "<|et|>": 50307,
+ "<|eu|>": 50310,
+ "<|fa|>": 50300,
+ "<|fi|>": 50277,
+ "<|fo|>": 50338,
+ "<|fr|>": 50265,
+ "<|gl|>": 50319,
+ "<|gu|>": 50333,
+ "<|haw|>": 50352,
+ "<|ha|>": 50354,
+ "<|he|>": 50279,
+ "<|hi|>": 50276,
+ "<|hr|>": 50291,
+ "<|ht|>": 50339,
+ "<|hu|>": 50286,
+ "<|hy|>": 50312,
+ "<|id|>": 50275,
+ "<|is|>": 50311,
+ "<|it|>": 50274,
+ "<|ja|>": 50266,
+ "<|jw|>": 50356,
+ "<|ka|>": 50329,
+ "<|kk|>": 50316,
+ "<|km|>": 50323,
+ "<|kn|>": 50306,
+ "<|ko|>": 50264,
+ "<|la|>": 50294,
+ "<|lb|>": 50345,
+ "<|ln|>": 50353,
+ "<|lo|>": 50336,
+ "<|lt|>": 50293,
+ "<|lv|>": 50301,
+ "<|mg|>": 50349,
+ "<|mi|>": 50295,
+ "<|mk|>": 50308,
+ "<|ml|>": 50296,
+ "<|mn|>": 50314,
+ "<|mr|>": 50320,
+ "<|ms|>": 50282,
+ "<|mt|>": 50343,
+ "<|my|>": 50346,
+ "<|ne|>": 50313,
+ "<|nl|>": 50271,
+ "<|nn|>": 50342,
+ "<|no|>": 50288,
+ "<|oc|>": 50328,
+ "<|pa|>": 50321,
+ "<|pl|>": 50269,
+ "<|ps|>": 50340,
+ "<|pt|>": 50267,
+ "<|ro|>": 50284,
+ "<|ru|>": 50263,
+ "<|sa|>": 50344,
+ "<|sd|>": 50332,
+ "<|si|>": 50322,
+ "<|sk|>": 50298,
+ "<|sl|>": 50305,
+ "<|sn|>": 50324,
+ "<|so|>": 50326,
+ "<|sq|>": 50317,
+ "<|sr|>": 50303,
+ "<|su|>": 50357,
+ "<|sv|>": 50273,
+ "<|sw|>": 50318,
+ "<|ta|>": 50287,
+ "<|te|>": 50299,
+ "<|tg|>": 50331,
+ "<|th|>": 50289,
+ "<|tk|>": 50341,
+ "<|tl|>": 50348,
+ "<|tr|>": 50268,
+ "<|tt|>": 50351,
+ "<|uk|>": 50280,
+ "<|ur|>": 50290,
+ "<|uz|>": 50337,
+ "<|vi|>": 50278,
+ "<|yi|>": 50335,
+ "<|yo|>": 50325,
+ "<|zh|>": 50260
+ },
+ "language": "chinese",
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1,
+ 2,
+ 7,
+ 8,
+ 9,
+ 10,
+ 14,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 31,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 90,
+ 91,
+ 92,
+ 93,
+ 359,
+ 503,
+ 522,
+ 542,
+ 873,
+ 893,
+ 902,
+ 918,
+ 922,
+ 931,
+ 1350,
+ 1853,
+ 1982,
+ 2460,
+ 2627,
+ 3246,
+ 3253,
+ 3268,
+ 3536,
+ 3846,
+ 3961,
+ 4183,
+ 4667,
+ 6585,
+ 6647,
+ 7273,
+ 9061,
+ 9383,
+ 10428,
+ 10929,
+ 11938,
+ 12033,
+ 12331,
+ 12562,
+ 13793,
+ 14157,
+ 14635,
+ 15265,
+ 15618,
+ 16553,
+ 16604,
+ 18362,
+ 18956,
+ 20075,
+ 21675,
+ 22520,
+ 26130,
+ 26161,
+ 26435,
+ 28279,
+ 29464,
+ 31650,
+ 32302,
+ 32470,
+ 36865,
+ 42863,
+ 47425,
+ 49870,
+ 50254,
+ 50258,
+ 50360,
+ 50361,
+ 50362
+ ],
+ "task": "transcribe",
+ "task_to_id": {
+ "transcribe": 50359,
+ "translate": 50358
+ },
+ "transformers_version": "4.51.3"
+ }
checkpoint-2000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b580d6cbf357f914ebad517ce76e9a590906d36f2e41208991296583e766f136
+ size 966995080
checkpoint-2000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5d9cb7977441320345aee891ec9a72b144d737f08b54ec5ce5e0d0943eacb5ac
+ size 1925064044
checkpoint-2000/preprocessor_config.json ADDED
@@ -0,0 +1,15 @@
+ {
+ "chunk_length": 30,
+ "dither": 0.0,
+ "feature_extractor_type": "WhisperFeatureExtractor",
+ "feature_size": 80,
+ "hop_length": 160,
+ "n_fft": 400,
+ "n_samples": 480000,
+ "nb_max_frames": 3000,
+ "padding_side": "right",
+ "padding_value": 0.0,
+ "processor_class": "WhisperProcessor",
+ "return_attention_mask": false,
+ "sampling_rate": 16000
+ }
checkpoint-2000/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f6346395eeb6a1db290bca4627171249205abfdc893301831d9394a6a8682fca
+ size 14244
checkpoint-2000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:29fb9e79fa30fbb431af919246a50a3118e2599b8f861d1f7ece53767b613869
+ size 1064
checkpoint-2000/trainer_state.json ADDED
@@ -0,0 +1,612 @@
+ {
+ "best_global_step": 2000,
+ "best_metric": 82.5098389256299,
+ "best_model_checkpoint": "./working_area/output_model/checkpoint-2000",
+ "epoch": 0.3234675723758693,
+ "eval_steps": 1000,
+ "global_step": 2000,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.004043344654698367,
+ "grad_norm": 10.67656421661377,
+ "learning_rate": 4.800000000000001e-07,
+ "loss": 0.2728,
+ "step": 25
+ },
+ {
+ "epoch": 0.008086689309396733,
+ "grad_norm": 0.34478363394737244,
+ "learning_rate": 9.800000000000001e-07,
+ "loss": 0.0358,
+ "step": 50
+ },
+ {
+ "epoch": 0.012130033964095099,
+ "grad_norm": 0.26953190565109253,
+ "learning_rate": 1.48e-06,
+ "loss": 0.0261,
+ "step": 75
+ },
+ {
+ "epoch": 0.016173378618793467,
+ "grad_norm": 0.2419331669807434,
+ "learning_rate": 1.98e-06,
+ "loss": 0.0251,
+ "step": 100
+ },
+ {
+ "epoch": 0.020216723273491832,
+ "grad_norm": 0.29993513226509094,
+ "learning_rate": 2.4800000000000004e-06,
+ "loss": 0.0247,
+ "step": 125
+ },
+ {
+ "epoch": 0.024260067928190198,
+ "grad_norm": 0.2362569123506546,
+ "learning_rate": 2.9800000000000003e-06,
+ "loss": 0.0227,
+ "step": 150
+ },
+ {
+ "epoch": 0.028303412582888564,
+ "grad_norm": 0.2520463466644287,
+ "learning_rate": 3.48e-06,
+ "loss": 0.0246,
+ "step": 175
+ },
+ {
+ "epoch": 0.03234675723758693,
+ "grad_norm": 0.2729567885398865,
+ "learning_rate": 3.980000000000001e-06,
+ "loss": 0.0236,
+ "step": 200
+ },
+ {
+ "epoch": 0.036390101892285295,
+ "grad_norm": 0.21665456891059875,
+ "learning_rate": 4.48e-06,
+ "loss": 0.0242,
+ "step": 225
+ },
+ {
+ "epoch": 0.040433446546983665,
+ "grad_norm": 0.27291786670684814,
+ "learning_rate": 4.980000000000001e-06,
+ "loss": 0.0242,
+ "step": 250
+ },
+ {
+ "epoch": 0.044476791201682034,
+ "grad_norm": 0.20698551833629608,
+ "learning_rate": 5.480000000000001e-06,
+ "loss": 0.0229,
+ "step": 275
+ },
+ {
+ "epoch": 0.048520135856380396,
+ "grad_norm": 0.2506471276283264,
+ "learning_rate": 5.98e-06,
+ "loss": 0.0225,
+ "step": 300
+ },
+ {
+ "epoch": 0.052563480511078765,
+ "grad_norm": 0.24540211260318756,
+ "learning_rate": 6.480000000000001e-06,
+ "loss": 0.0231,
+ "step": 325
+ },
+ {
+ "epoch": 0.05660682516577713,
+ "grad_norm": 0.2298947423696518,
+ "learning_rate": 6.98e-06,
+ "loss": 0.0223,
+ "step": 350
+ },
+ {
+ "epoch": 0.0606501698204755,
+ "grad_norm": 0.19595475494861603,
+ "learning_rate": 7.48e-06,
+ "loss": 0.022,
+ "step": 375
+ },
+ {
+ "epoch": 0.06469351447517387,
+ "grad_norm": 0.2605821490287781,
+ "learning_rate": 7.980000000000002e-06,
+ "loss": 0.022,
+ "step": 400
+ },
+ {
+ "epoch": 0.06873685912987224,
+ "grad_norm": 0.25346869230270386,
+ "learning_rate": 8.48e-06,
+ "loss": 0.0226,
+ "step": 425
+ },
+ {
+ "epoch": 0.07278020378457059,
+ "grad_norm": 0.255500465631485,
+ "learning_rate": 8.98e-06,
+ "loss": 0.0216,
+ "step": 450
+ },
+ {
+ "epoch": 0.07682354843926896,
+ "grad_norm": 0.24462725222110748,
+ "learning_rate": 9.48e-06,
+ "loss": 0.0234,
+ "step": 475
+ },
+ {
+ "epoch": 0.08086689309396733,
+ "grad_norm": 0.23454348742961884,
+ "learning_rate": 9.980000000000001e-06,
+ "loss": 0.024,
+ "step": 500
+ },
+ {
+ "epoch": 0.0849102377486657,
+ "grad_norm": 0.2627425193786621,
+ "learning_rate": 9.946666666666667e-06,
+ "loss": 0.0226,
+ "step": 525
+ },
+ {
+ "epoch": 0.08895358240336407,
+ "grad_norm": 0.22754116356372833,
+ "learning_rate": 9.891111111111113e-06,
+ "loss": 0.023,
+ "step": 550
+ },
+ {
+ "epoch": 0.09299692705806242,
+ "grad_norm": 0.26447415351867676,
+ "learning_rate": 9.835555555555556e-06,
+ "loss": 0.0215,
+ "step": 575
+ },
+ {
+ "epoch": 0.09704027171276079,
+ "grad_norm": 0.27743828296661377,
+ "learning_rate": 9.780000000000001e-06,
+ "loss": 0.0215,
+ "step": 600
+ },
+ {
+ "epoch": 0.10108361636745916,
+ "grad_norm": 0.23157595098018646,
+ "learning_rate": 9.724444444444445e-06,
+ "loss": 0.023,
+ "step": 625
+ },
+ {
+ "epoch": 0.10512696102215753,
+ "grad_norm": 0.2561393082141876,
+ "learning_rate": 9.66888888888889e-06,
+ "loss": 0.0209,
+ "step": 650
+ },
+ {
+ "epoch": 0.1091703056768559,
+ "grad_norm": 0.2518314719200134,
+ "learning_rate": 9.613333333333335e-06,
+ "loss": 0.0235,
+ "step": 675
+ },
+ {
+ "epoch": 0.11321365033155426,
+ "grad_norm": 0.2346208244562149,
+ "learning_rate": 9.557777777777777e-06,
+ "loss": 0.0222,
+ "step": 700
+ },
+ {
+ "epoch": 0.11725699498625262,
+ "grad_norm": 0.23254457116127014,
+ "learning_rate": 9.502222222222223e-06,
+ "loss": 0.0216,
+ "step": 725
+ },
+ {
+ "epoch": 0.121300339640951,
+ "grad_norm": 0.2728439271450043,
+ "learning_rate": 9.446666666666667e-06,
+ "loss": 0.0208,
+ "step": 750
+ },
+ {
+ "epoch": 0.12534368429564935,
+ "grad_norm": 0.24401770532131195,
+ "learning_rate": 9.391111111111111e-06,
+ "loss": 0.0222,
+ "step": 775
+ },
+ {
+ "epoch": 0.12938702895034773,
+ "grad_norm": 0.2546014189720154,
+ "learning_rate": 9.335555555555557e-06,
+ "loss": 0.0223,
+ "step": 800
+ },
+ {
+ "epoch": 0.1334303736050461,
+ "grad_norm": 0.20592975616455078,
+ "learning_rate": 9.280000000000001e-06,
+ "loss": 0.0214,
+ "step": 825
+ },
+ {
+ "epoch": 0.13747371825974447,
+ "grad_norm": 0.2854577898979187,
+ "learning_rate": 9.224444444444445e-06,
+ "loss": 0.021,
+ "step": 850
+ },
+ {
+ "epoch": 0.14151706291444283,
+ "grad_norm": 0.2555174231529236,
+ "learning_rate": 9.168888888888889e-06,
+ "loss": 0.0216,
+ "step": 875
+ },
+ {
+ "epoch": 0.14556040756914118,
+ "grad_norm": 0.22724099457263947,
+ "learning_rate": 9.113333333333335e-06,
+ "loss": 0.0226,
+ "step": 900
+ },
+ {
+ "epoch": 0.14960375222383956,
+ "grad_norm": 0.24628663063049316,
+ "learning_rate": 9.057777777777779e-06,
+ "loss": 0.021,
+ "step": 925
+ },
+ {
+ "epoch": 0.15364709687853792,
+ "grad_norm": 0.22158333659172058,
+ "learning_rate": 9.002222222222223e-06,
+ "loss": 0.0221,
+ "step": 950
+ },
+ {
+ "epoch": 0.1576904415332363,
+ "grad_norm": 0.23985975980758667,
+ "learning_rate": 8.946666666666669e-06,
+ "loss": 0.0227,
+ "step": 975
+ },
+ {
+ "epoch": 0.16173378618793466,
+ "grad_norm": 0.24263052642345428,
+ "learning_rate": 8.891111111111111e-06,
+ "loss": 0.021,
+ "step": 1000
+ },
+ {
+ "epoch": 0.16173378618793466,
+ "eval_loss": 0.026818539947271347,
+ "eval_runtime": 15594.6052,
+ "eval_samples_per_second": 7.9,
+ "eval_steps_per_second": 0.494,
+ "eval_wer": 84.01915040370025,
+ "step": 1000
+ },
+ {
+ "epoch": 0.165777130842633,
+ "grad_norm": 0.24622474610805511,
+ "learning_rate": 8.835555555555557e-06,
+ "loss": 0.0215,
+ "step": 1025
+ },
+ {
+ "epoch": 0.1698204754973314,
+ "grad_norm": 0.218623086810112,
+ "learning_rate": 8.78e-06,
+ "loss": 0.0209,
+ "step": 1050
+ },
+ {
+ "epoch": 0.17386382015202975,
+ "grad_norm": 0.2270510345697403,
+ "learning_rate": 8.724444444444445e-06,
+ "loss": 0.0216,
+ "step": 1075
+ },
+ {
+ "epoch": 0.17790716480672814,
+ "grad_norm": 0.2207878828048706,
+ "learning_rate": 8.66888888888889e-06,
+ "loss": 0.022,
+ "step": 1100
+ },
+ {
+ "epoch": 0.1819505094614265,
+ "grad_norm": 0.18590456247329712,
+ "learning_rate": 8.613333333333333e-06,
+ "loss": 0.022,
+ "step": 1125
+ },
+ {
+ "epoch": 0.18599385411612485,
+ "grad_norm": 0.1838596612215042,
+ "learning_rate": 8.557777777777778e-06,
+ "loss": 0.0215,
+ "step": 1150
+ },
+ {
+ "epoch": 0.19003719877082323,
+ "grad_norm": 0.31669357419013977,
+ "learning_rate": 8.502222222222223e-06,
+ "loss": 0.0223,
+ "step": 1175
+ },
+ {
+ "epoch": 0.19408054342552158,
+ "grad_norm": 0.21895891427993774,
+ "learning_rate": 8.446666666666668e-06,
+ "loss": 0.0209,
+ "step": 1200
+ },
+ {
+ "epoch": 0.19812388808021997,
+ "grad_norm": 0.20323023200035095,
+ "learning_rate": 8.391111111111112e-06,
+ "loss": 0.021,
+ "step": 1225
+ },
+ {
+ "epoch": 0.20216723273491832,
+ "grad_norm": 0.25468146800994873,
+ "learning_rate": 8.335555555555556e-06,
+ "loss": 0.0224,
+ "step": 1250
+ },
+ {
+ "epoch": 0.20621057738961668,
+ "grad_norm": 0.20973573625087738,
+ "learning_rate": 8.28e-06,
+ "loss": 0.0211,
+ "step": 1275
+ },
+ {
+ "epoch": 0.21025392204431506,
+ "grad_norm": 0.26390501856803894,
+ "learning_rate": 8.224444444444444e-06,
+ "loss": 0.0208,
+ "step": 1300
+ },
+ {
+ "epoch": 0.21429726669901342,
+ "grad_norm": 0.2301158905029297,
+ "learning_rate": 8.16888888888889e-06,
+ "loss": 0.0205,
+ "step": 1325
+ },
+ {
+ "epoch": 0.2183406113537118,
+ "grad_norm": 0.24158801138401031,
+ "learning_rate": 8.113333333333334e-06,
+ "loss": 0.0206,
+ "step": 1350
+ },
+ {
+ "epoch": 0.22238395600841016,
+ "grad_norm": 0.21493862569332123,
+ "learning_rate": 8.057777777777778e-06,
+ "loss": 0.0208,
+ "step": 1375
+ },
+ {
+ "epoch": 0.2264273006631085,
+ "grad_norm": 0.2373913675546646,
+ "learning_rate": 8.002222222222222e-06,
+ "loss": 0.0198,
+ "step": 1400
+ },
+ {
+ "epoch": 0.2304706453178069,
+ "grad_norm": 0.2382635623216629,
+ "learning_rate": 7.946666666666666e-06,
+ "loss": 0.0198,
+ "step": 1425
+ },
+ {
+ "epoch": 0.23451398997250525,
+ "grad_norm": 0.256558895111084,
+ "learning_rate": 7.891111111111112e-06,
+ "loss": 0.0206,
+ "step": 1450
+ },
+ {
+ "epoch": 0.23855733462720363,
+ "grad_norm": 0.23155981302261353,
+ "learning_rate": 7.835555555555556e-06,
+ "loss": 0.0182,
+ "step": 1475
+ },
+ {
+ "epoch": 0.242600679281902,
+ "grad_norm": 0.24832689762115479,
+ "learning_rate": 7.78e-06,
+ "loss": 0.02,
+ "step": 1500
+ },
+ {
+ "epoch": 0.24664402393660034,
+ "grad_norm": 0.22871288657188416,
+ "learning_rate": 7.724444444444446e-06,
+ "loss": 0.0193,
+ "step": 1525
+ },
+ {
+ "epoch": 0.2506873685912987,
+ "grad_norm": 0.2521936297416687,
+ "learning_rate": 7.66888888888889e-06,
+ "loss": 0.0197,
+ "step": 1550
+ },
+ {
+ "epoch": 0.2547307132459971,
+ "grad_norm": 0.24429504573345184,
+ "learning_rate": 7.613333333333334e-06,
+ "loss": 0.0221,
+ "step": 1575
+ },
+ {
+ "epoch": 0.25877405790069546,
+ "grad_norm": 0.23084107041358948,
+ "learning_rate": 7.557777777777779e-06,
+ "loss": 0.02,
+ "step": 1600
+ },
+ {
+ "epoch": 0.2628174025553938,
+ "grad_norm": 0.2697436809539795,
+ "learning_rate": 7.502222222222223e-06,
+ "loss": 0.0211,
+ "step": 1625
+ },
+ {
+ "epoch": 0.2668607472100922,
+ "grad_norm": 0.23871037364006042,
+ "learning_rate": 7.446666666666668e-06,
+ "loss": 0.0214,
+ "step": 1650
+ },
+ {
+ "epoch": 0.27090409186479053,
+ "grad_norm": 0.2677021920681,
+ "learning_rate": 7.3911111111111125e-06,
+ "loss": 0.0203,
+ "step": 1675
+ },
+ {
+ "epoch": 0.27494743651948894,
+ "grad_norm": 0.1928839236497879,
+ "learning_rate": 7.335555555555556e-06,
+ "loss": 0.0203,
+ "step": 1700
+ },
+ {
+ "epoch": 0.2789907811741873,
+ "grad_norm": 0.26213887333869934,
+ "learning_rate": 7.280000000000001e-06,
+ "loss": 0.0205,
+ "step": 1725
+ },
+ {
+ "epoch": 0.28303412582888565,
+ "grad_norm": 0.23492328822612762,
+ "learning_rate": 7.224444444444445e-06,
+ "loss": 0.0201,
+ "step": 1750
+ },
+ {
+ "epoch": 0.287077470483584,
+ "grad_norm": 0.23946373164653778,
+ "learning_rate": 7.1688888888888895e-06,
+ "loss": 0.0195,
+ "step": 1775
+ },
+ {
+ "epoch": 0.29112081513828236,
+ "grad_norm": 0.20817314088344574,
+ "learning_rate": 7.113333333333334e-06,
+ "loss": 0.0211,
+ "step": 1800
+ },
+ {
+ "epoch": 0.2951641597929808,
+ "grad_norm": 0.22937080264091492,
+ "learning_rate": 7.057777777777778e-06,
+ "loss": 0.0203,
+ "step": 1825
+ },
+ {
+ "epoch": 0.29920750444767913,
+ "grad_norm": 0.23442518711090088,
+ "learning_rate": 7.0022222222222225e-06,
+ "loss": 0.0197,
+ "step": 1850
+ },
+ {
+ "epoch": 0.3032508491023775,
+ "grad_norm": 0.24289068579673767,
+ "learning_rate": 6.946666666666667e-06,
+ "loss": 0.0198,
+ "step": 1875
+ },
+ {
+ "epoch": 0.30729419375707584,
+ "grad_norm": 0.24261179566383362,
+ "learning_rate": 6.891111111111111e-06,
+ "loss": 0.0211,
+ "step": 1900
+ },
+ {
+ "epoch": 0.3113375384117742,
+ "grad_norm": 0.21574310958385468,
+ "learning_rate": 6.835555555555556e-06,
+ "loss": 0.0196,
+ "step": 1925
+ },
+ {
+ "epoch": 0.3153808830664726,
+ "grad_norm": 0.2648760676383972,
+ "learning_rate": 6.780000000000001e-06,
+ "loss": 0.021,
+ "step": 1950
+ },
+ {
+ "epoch": 0.31942422772117096,
+ "grad_norm": 0.22739122807979584,
+ "learning_rate": 6.724444444444444e-06,
+ "loss": 0.019,
+ "step": 1975
+ },
+ {
+ "epoch": 0.3234675723758693,
+ "grad_norm": 0.2167060226202011,
+ "learning_rate": 6.668888888888889e-06,
+ "loss": 0.0208,
+ "step": 2000
+ },
+ {
+ "epoch": 0.3234675723758693,
+ "eval_loss": 0.025321291759610176,
+ "eval_runtime": 14275.0619,
+ "eval_samples_per_second": 8.63,
+ "eval_steps_per_second": 0.539,
+ "eval_wer": 82.5098389256299,
+ "step": 2000
+ }
+ ],
+ "logging_steps": 25,
+ "max_steps": 5000,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 1,
+ "save_steps": 1000,
+ "stateful_callbacks": {
+ "TrainerControl": {
+ "args": {
+ "should_epoch_stop": false,
+ "should_evaluate": false,
+ "should_log": false,
+ "should_save": true,
+ "should_training_stop": false
+ },
+ "attributes": {}
+ }
+ },
+ "total_flos": 1.846946562048e+19,
+ "train_batch_size": 32,
+ "trial_name": null,
+ "trial_params": null
+ }
checkpoint-2000/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f9f7caca91e9e74c1f1f32c6f743e7a01e94bbb4a987606d363be4bc023f04fb
+ size 5496
checkpoint-3000/config.json ADDED
@@ -0,0 +1,60 @@
+ {
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": null,
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 768,
+ "decoder_attention_heads": 12,
+ "decoder_ffn_dim": 3072,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 12,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 12,
+ "encoder_ffn_dim": 3072,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 12,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [
+ [
+ 1,
+ 50259
+ ],
+ [
+ 2,
+ 50359
+ ],
+ [
+ 3,
+ 50363
+ ]
+ ],
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": null,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 12,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "torch_dtype": "float32",
+ "transformers_version": "4.51.3",
+ "use_cache": true,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
checkpoint-3000/generation_config.json ADDED
@@ -0,0 +1,254 @@
+ {
+ "alignment_heads": [
+ [
+ 5,
+ 3
+ ],
+ [
+ 5,
+ 9
+ ],
+ [
+ 8,
+ 0
+ ],
+ [
+ 8,
+ 4
+ ],
+ [
+ 8,
+ 7
+ ],
+ [
+ 8,
+ 8
+ ],
+ [
+ 9,
+ 0
+ ],
+ [
+ 9,
+ 7
+ ],
+ [
+ 9,
+ 9
+ ],
+ [
+ 10,
+ 5
+ ]
+ ],
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327,
+ "<|am|>": 50334,
+ "<|ar|>": 50272,
+ "<|as|>": 50350,
+ "<|az|>": 50304,
+ "<|ba|>": 50355,
+ "<|be|>": 50330,
+ "<|bg|>": 50292,
+ "<|bn|>": 50302,
+ "<|bo|>": 50347,
+ "<|br|>": 50309,
+ "<|bs|>": 50315,
+ "<|ca|>": 50270,
+ "<|cs|>": 50283,
+ "<|cy|>": 50297,
+ "<|da|>": 50285,
+ "<|de|>": 50261,
+ "<|el|>": 50281,
+ "<|en|>": 50259,
+ "<|es|>": 50262,
+ "<|et|>": 50307,
+ "<|eu|>": 50310,
+ "<|fa|>": 50300,
+ "<|fi|>": 50277,
+ "<|fo|>": 50338,
+ "<|fr|>": 50265,
+ "<|gl|>": 50319,
+ "<|gu|>": 50333,
+ "<|haw|>": 50352,
+ "<|ha|>": 50354,
+ "<|he|>": 50279,
+ "<|hi|>": 50276,
+ "<|hr|>": 50291,
+ "<|ht|>": 50339,
+ "<|hu|>": 50286,
+ "<|hy|>": 50312,
+ "<|id|>": 50275,
+ "<|is|>": 50311,
+ "<|it|>": 50274,
+ "<|ja|>": 50266,
+ "<|jw|>": 50356,
+ "<|ka|>": 50329,
+ "<|kk|>": 50316,
+ "<|km|>": 50323,
+ "<|kn|>": 50306,
+ "<|ko|>": 50264,
+ "<|la|>": 50294,
+ "<|lb|>": 50345,
+ "<|ln|>": 50353,
+ "<|lo|>": 50336,
+ "<|lt|>": 50293,
+ "<|lv|>": 50301,
+ "<|mg|>": 50349,
+ "<|mi|>": 50295,
+ "<|mk|>": 50308,
+ "<|ml|>": 50296,
+ "<|mn|>": 50314,
+ "<|mr|>": 50320,
+ "<|ms|>": 50282,
+ "<|mt|>": 50343,
+ "<|my|>": 50346,
+ "<|ne|>": 50313,
+ "<|nl|>": 50271,
+ "<|nn|>": 50342,
+ "<|no|>": 50288,
+ "<|oc|>": 50328,
+ "<|pa|>": 50321,
+ "<|pl|>": 50269,
+ "<|ps|>": 50340,
+ "<|pt|>": 50267,
+ "<|ro|>": 50284,
+ "<|ru|>": 50263,
+ "<|sa|>": 50344,
+ "<|sd|>": 50332,
+ "<|si|>": 50322,
+ "<|sk|>": 50298,
+ "<|sl|>": 50305,
+ "<|sn|>": 50324,
+ "<|so|>": 50326,
+ "<|sq|>": 50317,
+ "<|sr|>": 50303,
+ "<|su|>": 50357,
+ "<|sv|>": 50273,
+ "<|sw|>": 50318,
+ "<|ta|>": 50287,
+ "<|te|>": 50299,
+ "<|tg|>": 50331,
+ "<|th|>": 50289,
+ "<|tk|>": 50341,
+ "<|tl|>": 50348,
+ "<|tr|>": 50268,
+ "<|tt|>": 50351,
+ "<|uk|>": 50280,
+ "<|ur|>": 50290,
+ "<|uz|>": 50337,
+ "<|vi|>": 50278,
+ "<|yi|>": 50335,
+ "<|yo|>": 50325,
+ "<|zh|>": 50260
+ },
+ "language": "chinese",
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1,
+ 2,
+ 7,
+ 8,
+ 9,
+ 10,
+ 14,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 31,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 90,
+ 91,
+ 92,
+ 93,
+ 359,
+ 503,
+ 522,
+ 542,
+ 873,
+ 893,
+ 902,
+ 918,
+ 922,
+ 931,
+ 1350,
+ 1853,
+ 1982,
+ 2460,
+ 2627,
+ 3246,
+ 3253,
+ 3268,
+ 3536,
+ 3846,
+ 3961,
+ 4183,
+ 4667,
+ 6585,
+ 6647,
+ 7273,
+ 9061,
+ 9383,
+ 10428,
+ 10929,
+ 11938,
+ 12033,
+ 12331,
+ 12562,
+ 13793,
+ 14157,
+ 14635,
+ 15265,
+ 15618,
+ 16553,
+ 16604,
+ 18362,
+ 18956,
+ 20075,
+ 21675,
+ 22520,
+ 26130,
+ 26161,
+ 26435,
+ 28279,
+ 29464,
+ 31650,
+ 32302,
+ 32470,
+ 36865,
+ 42863,
+ 47425,
+ 49870,
+ 50254,
+ 50258,
+ 50360,
+ 50361,
+ 50362
+ ],
+ "task": "transcribe",
+ "task_to_id": {
+ "transcribe": 50359,
+ "translate": 50358
+ },
+ "transformers_version": "4.51.3"
+ }
checkpoint-3000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1a0dca8ecba2ae10035d62922ab0dcd365756245864e64cc98184d06fad4d7ef
+ size 966995080
checkpoint-3000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a61a2b3f926427cafc2635425cf5fa1968e95ad67254511fb2e0a211be794ead
+ size 1925064044
checkpoint-3000/preprocessor_config.json ADDED
@@ -0,0 +1,15 @@
+ {
+ "chunk_length": 30,
+ "dither": 0.0,
+ "feature_extractor_type": "WhisperFeatureExtractor",
+ "feature_size": 80,
+ "hop_length": 160,
+ "n_fft": 400,
+ "n_samples": 480000,
+ "nb_max_frames": 3000,
+ "padding_side": "right",
+ "padding_value": 0.0,
+ "processor_class": "WhisperProcessor",
+ "return_attention_mask": false,
+ "sampling_rate": 16000
+ }
checkpoint-3000/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:939a50d1c52ed45a65280c4aba95bbf531f90fb2220b041c3786ae8cc94573d2
+ size 14244
checkpoint-3000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:17af6a83bb1cb19cd0edadcdd8667775ae13ecbc6438dd8bbc5fbd929a74874b
+ size 1064
checkpoint-3000/trainer_state.json ADDED
@@ -0,0 +1,901 @@
+ {
+ "best_global_step": 3000,
+ "best_metric": 81.96210492149146,
+ "best_model_checkpoint": "./working_area/output_model/checkpoint-3000",
+ "epoch": 0.485201358563804,
+ "eval_steps": 1000,
+ "global_step": 3000,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.004043344654698367,
+ "grad_norm": 10.67656421661377,
+ "learning_rate": 4.800000000000001e-07,
+ "loss": 0.2728,
+ "step": 25
+ },
+ {
+ "epoch": 0.008086689309396733,
+ "grad_norm": 0.34478363394737244,
+ "learning_rate": 9.800000000000001e-07,
+ "loss": 0.0358,
+ "step": 50
+ },
+ {
+ "epoch": 0.012130033964095099,
+ "grad_norm": 0.26953190565109253,
+ "learning_rate": 1.48e-06,
+ "loss": 0.0261,
+ "step": 75
+ },
+ {
+ "epoch": 0.016173378618793467,
+ "grad_norm": 0.2419331669807434,
+ "learning_rate": 1.98e-06,
+ "loss": 0.0251,
+ "step": 100
+ },
+ {
+ "epoch": 0.020216723273491832,
+ "grad_norm": 0.29993513226509094,
+ "learning_rate": 2.4800000000000004e-06,
+ "loss": 0.0247,
+ "step": 125
+ },
+ {
+ "epoch": 0.024260067928190198,
+ "grad_norm": 0.2362569123506546,
+ "learning_rate": 2.9800000000000003e-06,
+ "loss": 0.0227,
+ "step": 150
+ },
+ {
+ "epoch": 0.028303412582888564,
+ "grad_norm": 0.2520463466644287,
+ "learning_rate": 3.48e-06,
+ "loss": 0.0246,
+ "step": 175
+ },
+ {
+ "epoch": 0.03234675723758693,
+ "grad_norm": 0.2729567885398865,
+ "learning_rate": 3.980000000000001e-06,
+ "loss": 0.0236,
+ "step": 200
+ },
+ {
+ "epoch": 0.036390101892285295,
+ "grad_norm": 0.21665456891059875,
+ "learning_rate": 4.48e-06,
+ "loss": 0.0242,
+ "step": 225
+ },
+ {
+ "epoch": 0.040433446546983665,
+ "grad_norm": 0.27291786670684814,
+ "learning_rate": 4.980000000000001e-06,
+ "loss": 0.0242,
+ "step": 250
+ },
+ {
+ "epoch": 0.044476791201682034,
+ "grad_norm": 0.20698551833629608,
+ "learning_rate": 5.480000000000001e-06,
+ "loss": 0.0229,
+ "step": 275
+ },
+ {
+ "epoch": 0.048520135856380396,
+ "grad_norm": 0.2506471276283264,
+ "learning_rate": 5.98e-06,
+ "loss": 0.0225,
+ "step": 300
+ },
+ {
+ "epoch": 0.052563480511078765,
+ "grad_norm": 0.24540211260318756,
+ "learning_rate": 6.480000000000001e-06,
+ "loss": 0.0231,
+ "step": 325
+ },
+ {
+ "epoch": 0.05660682516577713,
+ "grad_norm": 0.2298947423696518,
+ "learning_rate": 6.98e-06,
+ "loss": 0.0223,
+ "step": 350
+ },
+ {
+ "epoch": 0.0606501698204755,
+ "grad_norm": 0.19595475494861603,
+ "learning_rate": 7.48e-06,
+ "loss": 0.022,
+ "step": 375
+ },
+ {
+ "epoch": 0.06469351447517387,
+ "grad_norm": 0.2605821490287781,
+ "learning_rate": 7.980000000000002e-06,
+ "loss": 0.022,
+ "step": 400
+ },
+ {
+ "epoch": 0.06873685912987224,
+ "grad_norm": 0.25346869230270386,
+ "learning_rate": 8.48e-06,
+ "loss": 0.0226,
+ "step": 425
+ },
+ {
+ "epoch": 0.07278020378457059,
+ "grad_norm": 0.255500465631485,
+ "learning_rate": 8.98e-06,
+ "loss": 0.0216,
+ "step": 450
+ },
+ {
+ "epoch": 0.07682354843926896,
+ "grad_norm": 0.24462725222110748,
+ "learning_rate": 9.48e-06,
+ "loss": 0.0234,
+ "step": 475
+ },
+ {
+ "epoch": 0.08086689309396733,
+ "grad_norm": 0.23454348742961884,
+ "learning_rate": 9.980000000000001e-06,
+ "loss": 0.024,
+ "step": 500
+ },
+ {
+ "epoch": 0.0849102377486657,
+ "grad_norm": 0.2627425193786621,
+ "learning_rate": 9.946666666666667e-06,
+ "loss": 0.0226,
+ "step": 525
+ },
+ {
+ "epoch": 0.08895358240336407,
+ "grad_norm": 0.22754116356372833,
+ "learning_rate": 9.891111111111113e-06,
+ "loss": 0.023,
+ "step": 550
+ },
+ {
+ "epoch": 0.09299692705806242,
+ "grad_norm": 0.26447415351867676,
+ "learning_rate": 9.835555555555556e-06,
+ "loss": 0.0215,
+ "step": 575
+ },
+ {
+ "epoch": 0.09704027171276079,
+ "grad_norm": 0.27743828296661377,
+ "learning_rate": 9.780000000000001e-06,
+ "loss": 0.0215,
+ "step": 600
+ },
+ {
+ "epoch": 0.10108361636745916,
+ "grad_norm": 0.23157595098018646,
+ "learning_rate": 9.724444444444445e-06,
+ "loss": 0.023,
+ "step": 625
+ },
+ {
+ "epoch": 0.10512696102215753,
+ "grad_norm": 0.2561393082141876,
+ "learning_rate": 9.66888888888889e-06,
+ "loss": 0.0209,
+ "step": 650
+ },
+ {
+ "epoch": 0.1091703056768559,
+ "grad_norm": 0.2518314719200134,
+ "learning_rate": 9.613333333333335e-06,
+ "loss": 0.0235,
+ "step": 675
+ },
+ {
+ "epoch": 0.11321365033155426,
+ "grad_norm": 0.2346208244562149,
+ "learning_rate": 9.557777777777777e-06,
+ "loss": 0.0222,
+ "step": 700
+ },
+ {
+ "epoch": 0.11725699498625262,
+ "grad_norm": 0.23254457116127014,
+ "learning_rate": 9.502222222222223e-06,
+ "loss": 0.0216,
+ "step": 725
+ },
+ {
+ "epoch": 0.121300339640951,
+ "grad_norm": 0.2728439271450043,
+ "learning_rate": 9.446666666666667e-06,
+ "loss": 0.0208,
+ "step": 750
+ },
+ {
+ "epoch": 0.12534368429564935,
+ "grad_norm": 0.24401770532131195,
+ "learning_rate": 9.391111111111111e-06,
+ "loss": 0.0222,
+ "step": 775
+ },
+ {
+ "epoch": 0.12938702895034773,
+ "grad_norm": 0.2546014189720154,
+ "learning_rate": 9.335555555555557e-06,
+ "loss": 0.0223,
+ "step": 800
+ },
+ {
+ "epoch": 0.1334303736050461,
+ "grad_norm": 0.20592975616455078,
+ "learning_rate": 9.280000000000001e-06,
+ "loss": 0.0214,
+ "step": 825
+ },
+ {
+ "epoch": 0.13747371825974447,
+ "grad_norm": 0.2854577898979187,
+ "learning_rate": 9.224444444444445e-06,
+ "loss": 0.021,
+ "step": 850
+ },
+ {
+ "epoch": 0.14151706291444283,
+ "grad_norm": 0.2555174231529236,
+ "learning_rate": 9.168888888888889e-06,
+ "loss": 0.0216,
+ "step": 875
+ },
+ {
+ "epoch": 0.14556040756914118,
+ "grad_norm": 0.22724099457263947,
+ "learning_rate": 9.113333333333335e-06,
+ "loss": 0.0226,
+ "step": 900
+ },
+ {
+ "epoch": 0.14960375222383956,
+ "grad_norm": 0.24628663063049316,
+ "learning_rate": 9.057777777777779e-06,
+ "loss": 0.021,
+ "step": 925
+ },
+ {
+ "epoch": 0.15364709687853792,
+ "grad_norm": 0.22158333659172058,
+ "learning_rate": 9.002222222222223e-06,
+ "loss": 0.0221,
+ "step": 950
+ },
+ {
+ "epoch": 0.1576904415332363,
+ "grad_norm": 0.23985975980758667,
+ "learning_rate": 8.946666666666669e-06,
+ "loss": 0.0227,
+ "step": 975
+ },
+ {
+ "epoch": 0.16173378618793466,
+ "grad_norm": 0.24263052642345428,
+ "learning_rate": 8.891111111111111e-06,
+ "loss": 0.021,
+ "step": 1000
+ },
+ {
+ "epoch": 0.16173378618793466,
+ "eval_loss": 0.026818539947271347,
+ "eval_runtime": 15594.6052,
+ "eval_samples_per_second": 7.9,
+ "eval_steps_per_second": 0.494,
+ "eval_wer": 84.01915040370025,
+ "step": 1000
+ },
+ {
+ "epoch": 0.165777130842633,
+ "grad_norm": 0.24622474610805511,
+ "learning_rate": 8.835555555555557e-06,
+ "loss": 0.0215,
+ "step": 1025
+ },
+ {
+ "epoch": 0.1698204754973314,
+ "grad_norm": 0.218623086810112,
+ "learning_rate": 8.78e-06,
+ "loss": 0.0209,
+ "step": 1050
+ },
+ {
+ "epoch": 0.17386382015202975,
+ "grad_norm": 0.2270510345697403,
+ "learning_rate": 8.724444444444445e-06,
+ "loss": 0.0216,
+ "step": 1075
+ },
+ {
+ "epoch": 0.17790716480672814,
+ "grad_norm": 0.2207878828048706,
+ "learning_rate": 8.66888888888889e-06,
+ "loss": 0.022,
+ "step": 1100
+ },
+ {
+ "epoch": 0.1819505094614265,
+ "grad_norm": 0.18590456247329712,
+ "learning_rate": 8.613333333333333e-06,
+ "loss": 0.022,
+ "step": 1125
+ },
+ {
+ "epoch": 0.18599385411612485,
+ "grad_norm": 0.1838596612215042,
+ "learning_rate": 8.557777777777778e-06,
+ "loss": 0.0215,
+ "step": 1150
+ },
+ {
+ "epoch": 0.19003719877082323,
+ "grad_norm": 0.31669357419013977,
+ "learning_rate": 8.502222222222223e-06,
+ "loss": 0.0223,
+ "step": 1175
+ },
+ {
+ "epoch": 0.19408054342552158,
+ "grad_norm": 0.21895891427993774,
+ "learning_rate": 8.446666666666668e-06,
+ "loss": 0.0209,
+ "step": 1200
+ },
+ {
+ "epoch": 0.19812388808021997,
+ "grad_norm": 0.20323023200035095,
+ "learning_rate": 8.391111111111112e-06,
+ "loss": 0.021,
+ "step": 1225
+ },
+ {
+ "epoch": 0.20216723273491832,
+ "grad_norm": 0.25468146800994873,
+ "learning_rate": 8.335555555555556e-06,
+ "loss": 0.0224,
+ "step": 1250
+ },
+ {
+ "epoch": 0.20621057738961668,
+ "grad_norm": 0.20973573625087738,
+ "learning_rate": 8.28e-06,
+ "loss": 0.0211,
+ "step": 1275
+ },
+ {
+ "epoch": 0.21025392204431506,
+ "grad_norm": 0.26390501856803894,
+ "learning_rate": 8.224444444444444e-06,
+ "loss": 0.0208,
+ "step": 1300
+ },
+ {
+ "epoch": 0.21429726669901342,
+ "grad_norm": 0.2301158905029297,
+ "learning_rate": 8.16888888888889e-06,
+ "loss": 0.0205,
+ "step": 1325
+ },
+ {
+ "epoch": 0.2183406113537118,
+ "grad_norm": 0.24158801138401031,
+ "learning_rate": 8.113333333333334e-06,
+ "loss": 0.0206,
+ "step": 1350
+ },
+ {
+ "epoch": 0.22238395600841016,
+ "grad_norm": 0.21493862569332123,
+ "learning_rate": 8.057777777777778e-06,
+ "loss": 0.0208,
+ "step": 1375
+ },
+ {
+ "epoch": 0.2264273006631085,
+ "grad_norm": 0.2373913675546646,
+ "learning_rate": 8.002222222222222e-06,
+ "loss": 0.0198,
+ "step": 1400
+ },
+ {
+ "epoch": 0.2304706453178069,
+ "grad_norm": 0.2382635623216629,
+ "learning_rate": 7.946666666666666e-06,
+ "loss": 0.0198,
+ "step": 1425
+ },
+ {
+ "epoch": 0.23451398997250525,
+ "grad_norm": 0.256558895111084,
+ "learning_rate": 7.891111111111112e-06,
+ "loss": 0.0206,
+ "step": 1450
+ },
+ {
+ "epoch": 0.23855733462720363,
+ "grad_norm": 0.23155981302261353,
+ "learning_rate": 7.835555555555556e-06,
+ "loss": 0.0182,
+ "step": 1475
+ },
+ {
+ "epoch": 0.242600679281902,
+ "grad_norm": 0.24832689762115479,
+ "learning_rate": 7.78e-06,
+ "loss": 0.02,
+ "step": 1500
+ },
+ {
+ "epoch": 0.24664402393660034,
+ "grad_norm": 0.22871288657188416,
+ "learning_rate": 7.724444444444446e-06,
+ "loss": 0.0193,
+ "step": 1525
+ },
+ {
+ "epoch": 0.2506873685912987,
+ "grad_norm": 0.2521936297416687,
+ "learning_rate": 7.66888888888889e-06,
+ "loss": 0.0197,
+ "step": 1550
+ },
+ {
+ "epoch": 0.2547307132459971,
+ "grad_norm": 0.24429504573345184,
+ "learning_rate": 7.613333333333334e-06,
+ "loss": 0.0221,
+ "step": 1575
+ },
+ {
+ "epoch": 0.25877405790069546,
+ "grad_norm": 0.23084107041358948,
+ "learning_rate": 7.557777777777779e-06,
+ "loss": 0.02,
+ "step": 1600
+ },
+ {
+ "epoch": 0.2628174025553938,
+ "grad_norm": 0.2697436809539795,
+ "learning_rate": 7.502222222222223e-06,
+ "loss": 0.0211,
+ "step": 1625
+ },
+ {
+ "epoch": 0.2668607472100922,
+ "grad_norm": 0.23871037364006042,
+ "learning_rate": 7.446666666666668e-06,
+ "loss": 0.0214,
+ "step": 1650
+ },
+ {
+ "epoch": 0.27090409186479053,
+ "grad_norm": 0.2677021920681,
+ "learning_rate": 7.3911111111111125e-06,
+ "loss": 0.0203,
+ "step": 1675
+ },
+ {
+ "epoch": 0.27494743651948894,
+ "grad_norm": 0.1928839236497879,
+ "learning_rate": 7.335555555555556e-06,
+ "loss": 0.0203,
+ "step": 1700
+ },
+ {
+ "epoch": 0.2789907811741873,
+ "grad_norm": 0.26213887333869934,
+ "learning_rate": 7.280000000000001e-06,
+ "loss": 0.0205,
+ "step": 1725
+ },
+ {
+ "epoch": 0.28303412582888565,
+ "grad_norm": 0.23492328822612762,
+ "learning_rate": 7.224444444444445e-06,
+ "loss": 0.0201,
+ "step": 1750
+ },
+ {
+ "epoch": 0.287077470483584,
+ "grad_norm": 0.23946373164653778,
+ "learning_rate": 7.1688888888888895e-06,
+ "loss": 0.0195,
+ "step": 1775
+ },
+ {
+ "epoch": 0.29112081513828236,
+ "grad_norm": 0.20817314088344574,
+ "learning_rate": 7.113333333333334e-06,
+ "loss": 0.0211,
+ "step": 1800
+ },
+ {
+ "epoch": 0.2951641597929808,
+ "grad_norm": 0.22937080264091492,
+ "learning_rate": 7.057777777777778e-06,
+ "loss": 0.0203,
+ "step": 1825
+ },
+ {
+ "epoch": 0.29920750444767913,
+ "grad_norm": 0.23442518711090088,
+ "learning_rate": 7.0022222222222225e-06,
+ "loss": 0.0197,
+ "step": 1850
+ },
+ {
+ "epoch": 0.3032508491023775,
+ "grad_norm": 0.24289068579673767,
+ "learning_rate": 6.946666666666667e-06,
+ "loss": 0.0198,
+ "step": 1875
+ },
+ {
+ "epoch": 0.30729419375707584,
+ "grad_norm": 0.24261179566383362,
+ "learning_rate": 6.891111111111111e-06,
+ "loss": 0.0211,
+ "step": 1900
+ },
+ {
+ "epoch": 0.3113375384117742,
+ "grad_norm": 0.21574310958385468,
+ "learning_rate": 6.835555555555556e-06,
+ "loss": 0.0196,
+ "step": 1925
+ },
+ {
+ "epoch": 0.3153808830664726,
+ "grad_norm": 0.2648760676383972,
+ "learning_rate": 6.780000000000001e-06,
+ "loss": 0.021,
+ "step": 1950
+ },
+ {
+ "epoch": 0.31942422772117096,
+ "grad_norm": 0.22739122807979584,
+ "learning_rate": 6.724444444444444e-06,
+ "loss": 0.019,
+ "step": 1975
+ },
+ {
+ "epoch": 0.3234675723758693,
+ "grad_norm": 0.2167060226202011,
+ "learning_rate": 6.668888888888889e-06,
+ "loss": 0.0208,
+ "step": 2000
+ },
+ {
+ "epoch": 0.3234675723758693,
+ "eval_loss": 0.025321291759610176,
+ "eval_runtime": 14275.0619,
+ "eval_samples_per_second": 8.63,
+ "eval_steps_per_second": 0.539,
+ "eval_wer": 82.5098389256299,
+ "step": 2000
+ },
+ {
+ "epoch": 0.32751091703056767,
+ "grad_norm": 0.2261808067560196,
+ "learning_rate": 6.613333333333334e-06,
+ "loss": 0.0192,
+ "step": 2025
+ },
+ {
+ "epoch": 0.331554261685266,
+ "grad_norm": 0.26066452264785767,
+ "learning_rate": 6.557777777777778e-06,
+ "loss": 0.0222,
+ "step": 2050
+ },
+ {
+ "epoch": 0.33559760633996444,
+ "grad_norm": 0.22417497634887695,
+ "learning_rate": 6.502222222222223e-06,
+ "loss": 0.0204,
+ "step": 2075
+ },
+ {
+ "epoch": 0.3396409509946628,
+ "grad_norm": 0.20988379418849945,
+ "learning_rate": 6.446666666666668e-06,
+ "loss": 0.0191,
+ "step": 2100
+ },
+ {
+ "epoch": 0.34368429564936115,
+ "grad_norm": 0.22671236097812653,
+ "learning_rate": 6.391111111111111e-06,
+ "loss": 0.0211,
+ "step": 2125
+ },
+ {
+ "epoch": 0.3477276403040595,
+ "grad_norm": 0.2695440351963043,
+ "learning_rate": 6.335555555555556e-06,
+ "loss": 0.0197,
+ "step": 2150
+ },
+ {
+ "epoch": 0.35177098495875786,
+ "grad_norm": 0.20911026000976562,
+ "learning_rate": 6.280000000000001e-06,
+ "loss": 0.0191,
+ "step": 2175
+ },
+ {
+ "epoch": 0.35581432961345627,
+ "grad_norm": 0.21661245822906494,
+ "learning_rate": 6.224444444444445e-06,
+ "loss": 0.0199,
+ "step": 2200
+ },
+ {
+ "epoch": 0.3598576742681546,
+ "grad_norm": 0.222182035446167,
+ "learning_rate": 6.16888888888889e-06,
+ "loss": 0.0195,
+ "step": 2225
+ },
+ {
+ "epoch": 0.363901018922853,
+ "grad_norm": 0.2006169855594635,
+ "learning_rate": 6.113333333333333e-06,
+ "loss": 0.0199,
+ "step": 2250
+ },
+ {
+ "epoch": 0.36794436357755134,
+ "grad_norm": 0.22298727929592133,
+ "learning_rate": 6.057777777777778e-06,
+ "loss": 0.02,
+ "step": 2275
+ },
+ {
+ "epoch": 0.3719877082322497,
+ "grad_norm": 0.19001857936382294,
+ "learning_rate": 6.002222222222223e-06,
+ "loss": 0.0197,
+ "step": 2300
+ },
+ {
+ "epoch": 0.3760310528869481,
+ "grad_norm": 0.2434571385383606,
+ "learning_rate": 5.946666666666668e-06,
+ "loss": 0.021,
+ "step": 2325
+ },
+ {
+ "epoch": 0.38007439754164646,
+ "grad_norm": 0.21516510844230652,
+ "learning_rate": 5.891111111111112e-06,
+ "loss": 0.0193,
+ "step": 2350
+ },
+ {
+ "epoch": 0.3841177421963448,
+ "grad_norm": 0.24581994116306305,
+ "learning_rate": 5.8355555555555565e-06,
+ "loss": 0.0199,
+ "step": 2375
+ },
+ {
+ "epoch": 0.38816108685104317,
+ "grad_norm": 0.23675447702407837,
+ "learning_rate": 5.78e-06,
+ "loss": 0.0195,
+ "step": 2400
+ },
+ {
+ "epoch": 0.3922044315057415,
+ "grad_norm": 0.21948282420635223,
+ "learning_rate": 5.724444444444445e-06,
+ "loss": 0.0194,
+ "step": 2425
+ },
+ {
+ "epoch": 0.39624777616043994,
+ "grad_norm": 0.21700553596019745,
+ "learning_rate": 5.6688888888888895e-06,
+ "loss": 0.0188,
+ "step": 2450
+ },
+ {
+ "epoch": 0.4002911208151383,
+ "grad_norm": 0.24019096791744232,
+ "learning_rate": 5.613333333333334e-06,
+ "loss": 0.0211,
+ "step": 2475
+ },
+ {
+ "epoch": 0.40433446546983665,
+ "grad_norm": 0.21174991130828857,
+ "learning_rate": 5.557777777777778e-06,
+ "loss": 0.0207,
+ "step": 2500
+ },
+ {
+ "epoch": 0.408377810124535,
+ "grad_norm": 0.2548556625843048,
+ "learning_rate": 5.5022222222222224e-06,
+ "loss": 0.0202,
+ "step": 2525
+ },
+ {
+ "epoch": 0.41242115477923336,
+ "grad_norm": 0.2106471061706543,
+ "learning_rate": 5.4466666666666665e-06,
+ "loss": 0.0206,
+ "step": 2550
+ },
+ {
+ "epoch": 0.41646449943393177,
+ "grad_norm": 0.20076259970664978,
+ "learning_rate": 5.391111111111111e-06,
+ "loss": 0.0188,
+ "step": 2575
+ },
+ {
+ "epoch": 0.4205078440886301,
+ "grad_norm": 0.20281557738780975,
+ "learning_rate": 5.335555555555556e-06,
+ "loss": 0.0189,
+ "step": 2600
+ },
+ {
+ "epoch": 0.4245511887433285,
+ "grad_norm": 0.23585236072540283,
+ "learning_rate": 5.28e-06,
+ "loss": 0.0193,
+ "step": 2625
+ },
+ {
+ "epoch": 0.42859453339802683,
+ "grad_norm": 0.20076635479927063,
+ "learning_rate": 5.224444444444445e-06,
+ "loss": 0.0188,
+ "step": 2650
+ },
+ {
+ "epoch": 0.4326378780527252,
+ "grad_norm": 0.20363172888755798,
+ "learning_rate": 5.168888888888889e-06,
+ "loss": 0.0188,
+ "step": 2675
+ },
+ {
+ "epoch": 0.4366812227074236,
+ "grad_norm": 0.20968079566955566,
+ "learning_rate": 5.113333333333333e-06,
+ "loss": 0.0198,
+ "step": 2700
+ },
+ {
+ "epoch": 0.44072456736212196,
+ "grad_norm": 0.2075379192829132,
+ "learning_rate": 5.057777777777778e-06,
+ "loss": 0.0192,
+ "step": 2725
+ },
+ {
+ "epoch": 0.4447679120168203,
+ "grad_norm": 0.19762763381004333,
+ "learning_rate": 5.002222222222223e-06,
+ "loss": 0.0192,
+ "step": 2750
+ },
+ {
+ "epoch": 0.44881125667151867,
+ "grad_norm": 0.2090006172657013,
+ "learning_rate": 4.946666666666667e-06,
+ "loss": 0.0196,
+ "step": 2775
+ },
+ {
+ "epoch": 0.452854601326217,
+ "grad_norm": 0.20064634084701538,
+ "learning_rate": 4.891111111111111e-06,
+ "loss": 0.0187,
+ "step": 2800
+ },
+ {
+ "epoch": 0.45689794598091543,
+ "grad_norm": 0.20843106508255005,
+ "learning_rate": 4.835555555555556e-06,
+ "loss": 0.0192,
+ "step": 2825
+ },
+ {
+ "epoch": 0.4609412906356138,
+ "grad_norm": 0.22503884136676788,
+ "learning_rate": 4.78e-06,
+ "loss": 0.0195,
+ "step": 2850
+ },
+ {
+ "epoch": 0.46498463529031214,
+ "grad_norm": 0.22823262214660645,
+ "learning_rate": 4.724444444444445e-06,
+ "loss": 0.0199,
+ "step": 2875
+ },
+ {
+ "epoch": 0.4690279799450105,
+ "grad_norm": 0.24186566472053528,
+ "learning_rate": 4.66888888888889e-06,
+ "loss": 0.0194,
+ "step": 2900
+ },
+ {
+ "epoch": 0.47307132459970885,
+ "grad_norm": 0.27571889758110046,
+ "learning_rate": 4.613333333333334e-06,
+ "loss": 0.0194,
+ "step": 2925
+ },
+ {
+ "epoch": 0.47711466925440726,
+ "grad_norm": 0.22955967485904694,
+ "learning_rate": 4.557777777777778e-06,
+ "loss": 0.0201,
+ "step": 2950
+ },
+ {
+ "epoch": 0.4811580139091056,
+ "grad_norm": 0.21334044635295868,
+ "learning_rate": 4.502222222222223e-06,
+ "loss": 0.0204,
+ "step": 2975
+ },
+ {
+ "epoch": 0.485201358563804,
+ "grad_norm": 0.20334798097610474,
+ "learning_rate": 4.446666666666667e-06,
+ "loss": 0.0207,
+ "step": 3000
+ },
+ {
+ "epoch": 0.485201358563804,
+ "eval_loss": 0.024814918637275696,
+ "eval_runtime": 14244.4286,
+ "eval_samples_per_second": 8.649,
+ "eval_steps_per_second": 0.541,
+ "eval_wer": 81.96210492149146,
+ "step": 3000
+ }
+ ],
+ "logging_steps": 25,
+ "max_steps": 5000,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 1,
+ "save_steps": 1000,
+ "stateful_callbacks": {
+ "TrainerControl": {
+ "args": {
+ "should_epoch_stop": false,
+ "should_evaluate": false,
+ "should_log": false,
+ "should_save": true,
+ "should_training_stop": false
+ },
+ "attributes": {}
+ }
+ },
+ "total_flos": 2.770419843072e+19,
+ "train_batch_size": 32,
+ "trial_name": null,
+ "trial_params": null
+ }
checkpoint-3000/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f9f7caca91e9e74c1f1f32c6f743e7a01e94bbb4a987606d363be4bc023f04fb
+ size 5496
checkpoint-4000/config.json ADDED
@@ -0,0 +1,60 @@
+ {
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": null,
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 768,
+ "decoder_attention_heads": 12,
+ "decoder_ffn_dim": 3072,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 12,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 12,
+ "encoder_ffn_dim": 3072,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 12,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [
+ [
+ 1,
+ 50259
+ ],
+ [
+ 2,
+ 50359
+ ],
+ [
+ 3,
+ 50363
+ ]
+ ],
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": null,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 12,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "torch_dtype": "float32",
+ "transformers_version": "4.51.3",
+ "use_cache": true,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
checkpoint-4000/generation_config.json ADDED
@@ -0,0 +1,254 @@
+ {
+ "alignment_heads": [
+ [
+ 5,
+ 3
+ ],
+ [
+ 5,
+ 9
+ ],
+ [
+ 8,
+ 0
+ ],
+ [
+ 8,
+ 4
+ ],
+ [
+ 8,
+ 7
+ ],
+ [
+ 8,
+ 8
+ ],
+ [
+ 9,
+ 0
+ ],
+ [
+ 9,
+ 7
+ ],
+ [
+ 9,
+ 9
+ ],
+ [
+ 10,
+ 5
+ ]
+ ],
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327,
+ "<|am|>": 50334,
+ "<|ar|>": 50272,
+ "<|as|>": 50350,
+ "<|az|>": 50304,
+ "<|ba|>": 50355,
+ "<|be|>": 50330,
+ "<|bg|>": 50292,
+ "<|bn|>": 50302,
+ "<|bo|>": 50347,
+ "<|br|>": 50309,
+ "<|bs|>": 50315,
+ "<|ca|>": 50270,
+ "<|cs|>": 50283,
+ "<|cy|>": 50297,
+ "<|da|>": 50285,
+ "<|de|>": 50261,
+ "<|el|>": 50281,
+ "<|en|>": 50259,
+ "<|es|>": 50262,
+ "<|et|>": 50307,
+ "<|eu|>": 50310,
+ "<|fa|>": 50300,
+ "<|fi|>": 50277,
+ "<|fo|>": 50338,
+ "<|fr|>": 50265,
+ "<|gl|>": 50319,
+ "<|gu|>": 50333,
+ "<|haw|>": 50352,
+ "<|ha|>": 50354,
+ "<|he|>": 50279,
+ "<|hi|>": 50276,
+ "<|hr|>": 50291,
+ "<|ht|>": 50339,
+ "<|hu|>": 50286,
+ "<|hy|>": 50312,
+ "<|id|>": 50275,
+ "<|is|>": 50311,
+ "<|it|>": 50274,
+ "<|ja|>": 50266,
+ "<|jw|>": 50356,
+ "<|ka|>": 50329,
+ "<|kk|>": 50316,
+ "<|km|>": 50323,
+ "<|kn|>": 50306,
+ "<|ko|>": 50264,
+ "<|la|>": 50294,
+ "<|lb|>": 50345,
+ "<|ln|>": 50353,
+ "<|lo|>": 50336,
+ "<|lt|>": 50293,
+ "<|lv|>": 50301,
+ "<|mg|>": 50349,
+ "<|mi|>": 50295,
+ "<|mk|>": 50308,
+ "<|ml|>": 50296,
+ "<|mn|>": 50314,
+ "<|mr|>": 50320,
+ "<|ms|>": 50282,
+ "<|mt|>": 50343,
+ "<|my|>": 50346,
+ "<|ne|>": 50313,
+ "<|nl|>": 50271,
+ "<|nn|>": 50342,
+ "<|no|>": 50288,
+ "<|oc|>": 50328,
+ "<|pa|>": 50321,
+ "<|pl|>": 50269,
+ "<|ps|>": 50340,
+ "<|pt|>": 50267,
+ "<|ro|>": 50284,
+ "<|ru|>": 50263,
+ "<|sa|>": 50344,
+ "<|sd|>": 50332,
+ "<|si|>": 50322,
+ "<|sk|>": 50298,
+ "<|sl|>": 50305,
+ "<|sn|>": 50324,
+ "<|so|>": 50326,
+ "<|sq|>": 50317,
+ "<|sr|>": 50303,
+ "<|su|>": 50357,
+ "<|sv|>": 50273,
+ "<|sw|>": 50318,
+ "<|ta|>": 50287,
+ "<|te|>": 50299,
+ "<|tg|>": 50331,
+ "<|th|>": 50289,
+ "<|tk|>": 50341,
+ "<|tl|>": 50348,
+ "<|tr|>": 50268,
+ "<|tt|>": 50351,
+ "<|uk|>": 50280,
+ "<|ur|>": 50290,
+ "<|uz|>": 50337,
+ "<|vi|>": 50278,
+ "<|yi|>": 50335,
+ "<|yo|>": 50325,
+ "<|zh|>": 50260
+ },
+ "language": "chinese",
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1,
+ 2,
+ 7,
+ 8,
+ 9,
+ 10,
+ 14,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 31,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 90,
+ 91,
+ 92,
+ 93,
+ 359,
+ 503,
+ 522,
+ 542,
+ 873,
+ 893,
+ 902,
+ 918,
+ 922,
+ 931,
+ 1350,
+ 1853,
+ 1982,
+ 2460,
+ 2627,
+ 3246,
+ 3253,
+ 3268,
+ 3536,
+ 3846,
+ 3961,
+ 4183,
+ 4667,
+ 6585,
+ 6647,
+ 7273,
+ 9061,
+ 9383,
+ 10428,
+ 10929,
+ 11938,
+ 12033,
+ 12331,
+ 12562,
+ 13793,
+ 14157,
+ 14635,
+ 15265,
+ 15618,
+ 16553,
+ 16604,
+ 18362,
+ 18956,
+ 20075,
+ 21675,
+ 22520,
+ 26130,
+ 26161,
+ 26435,
+ 28279,
+ 29464,
+ 31650,
+ 32302,
+ 32470,
+ 36865,
+ 42863,
+ 47425,
+ 49870,
+ 50254,
+ 50258,
+ 50360,
+ 50361,
+ 50362
+ ],
+ "task": "transcribe",
+ "task_to_id": {
+ "transcribe": 50359,
+ "translate": 50358
+ },
+ "transformers_version": "4.51.3"
+ }
checkpoint-4000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8d7fe4ad804df35cb408f691c326f763852355059bf6d7f95cdb9de1505a5a24
+ size 966995080
checkpoint-4000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:fcf737e09304f327d6e849c7e1c6d5ce0316aa8f403192e1d8182dc79a628c0c
+ size 1925064044
checkpoint-4000/preprocessor_config.json ADDED
@@ -0,0 +1,15 @@
+ {
+ "chunk_length": 30,
+ "dither": 0.0,
+ "feature_extractor_type": "WhisperFeatureExtractor",
+ "feature_size": 80,
+ "hop_length": 160,
+ "n_fft": 400,
+ "n_samples": 480000,
+ "nb_max_frames": 3000,
+ "padding_side": "right",
+ "padding_value": 0.0,
+ "processor_class": "WhisperProcessor",
+ "return_attention_mask": false,
+ "sampling_rate": 16000
+ }
checkpoint-4000/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:bb5799f0d563f7b73656e801beed6baf8f64ab1754dffca6d57b85481a7eb816
+ size 14244
checkpoint-4000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4a76e193687c482f6cc875caf45cbf094edc541bfbe3eb9f8259fd2d597d2f4e
+ size 1064
checkpoint-4000/trainer_state.json ADDED
@@ -0,0 +1,1190 @@
+ {
+ "best_global_step": 4000,
+ "best_metric": 81.67647178155556,
+ "best_model_checkpoint": "./working_area/output_model/checkpoint-4000",
+ "epoch": 0.6469351447517386,
+ "eval_steps": 1000,
+ "global_step": 4000,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.004043344654698367,
+ "grad_norm": 10.67656421661377,
+ "learning_rate": 4.800000000000001e-07,
+ "loss": 0.2728,
+ "step": 25
+ },
+ {
+ "epoch": 0.008086689309396733,
+ "grad_norm": 0.34478363394737244,
+ "learning_rate": 9.800000000000001e-07,
+ "loss": 0.0358,
+ "step": 50
+ },
+ {
+ "epoch": 0.012130033964095099,
+ "grad_norm": 0.26953190565109253,
+ "learning_rate": 1.48e-06,
+ "loss": 0.0261,
+ "step": 75
+ },
+ {
+ "epoch": 0.016173378618793467,
+ "grad_norm": 0.2419331669807434,
+ "learning_rate": 1.98e-06,
+ "loss": 0.0251,
+ "step": 100
+ },
+ {
+ "epoch": 0.020216723273491832,
+ "grad_norm": 0.29993513226509094,
+ "learning_rate": 2.4800000000000004e-06,
+ "loss": 0.0247,
+ "step": 125
+ },
+ {
+ "epoch": 0.024260067928190198,
+ "grad_norm": 0.2362569123506546,
+ "learning_rate": 2.9800000000000003e-06,
+ "loss": 0.0227,
+ "step": 150
+ },
+ {
+ "epoch": 0.028303412582888564,
+ "grad_norm": 0.2520463466644287,
+ "learning_rate": 3.48e-06,
+ "loss": 0.0246,
+ "step": 175
+ },
+ {
+ "epoch": 0.03234675723758693,
+ "grad_norm": 0.2729567885398865,
+ "learning_rate": 3.980000000000001e-06,
+ "loss": 0.0236,
+ "step": 200
+ },
+ {
+ "epoch": 0.036390101892285295,
+ "grad_norm": 0.21665456891059875,
+ "learning_rate": 4.48e-06,
+ "loss": 0.0242,
+ "step": 225
+ },
+ {
+ "epoch": 0.040433446546983665,
+ "grad_norm": 0.27291786670684814,
+ "learning_rate": 4.980000000000001e-06,
+ "loss": 0.0242,
+ "step": 250
+ },
+ {
+ "epoch": 0.044476791201682034,
+ "grad_norm": 0.20698551833629608,
+ "learning_rate": 5.480000000000001e-06,
+ "loss": 0.0229,
+ "step": 275
+ },
+ {
+ "epoch": 0.048520135856380396,
+ "grad_norm": 0.2506471276283264,
+ "learning_rate": 5.98e-06,
+ "loss": 0.0225,
+ "step": 300
+ },
+ {
+ "epoch": 0.052563480511078765,
+ "grad_norm": 0.24540211260318756,
+ "learning_rate": 6.480000000000001e-06,
+ "loss": 0.0231,
+ "step": 325
+ },
+ {
+ "epoch": 0.05660682516577713,
+ "grad_norm": 0.2298947423696518,
+ "learning_rate": 6.98e-06,
+ "loss": 0.0223,
+ "step": 350
+ },
+ {
+ "epoch": 0.0606501698204755,
+ "grad_norm": 0.19595475494861603,
+ "learning_rate": 7.48e-06,
+ "loss": 0.022,
+ "step": 375
+ },
+ {
+ "epoch": 0.06469351447517387,
+ "grad_norm": 0.2605821490287781,
+ "learning_rate": 7.980000000000002e-06,
+ "loss": 0.022,
+ "step": 400
+ },
+ {
+ "epoch": 0.06873685912987224,
+ "grad_norm": 0.25346869230270386,
+ "learning_rate": 8.48e-06,
+ "loss": 0.0226,
+ "step": 425
+ },
+ {
+ "epoch": 0.07278020378457059,
+ "grad_norm": 0.255500465631485,
+ "learning_rate": 8.98e-06,
+ "loss": 0.0216,
+ "step": 450
+ },
+ {
+ "epoch": 0.07682354843926896,
+ "grad_norm": 0.24462725222110748,
+ "learning_rate": 9.48e-06,
+ "loss": 0.0234,
+ "step": 475
+ },
+ {
+ "epoch": 0.08086689309396733,
+ "grad_norm": 0.23454348742961884,
+ "learning_rate": 9.980000000000001e-06,
+ "loss": 0.024,
+ "step": 500
+ },
+ {
+ "epoch": 0.0849102377486657,
+ "grad_norm": 0.2627425193786621,
+ "learning_rate": 9.946666666666667e-06,
+ "loss": 0.0226,
+ "step": 525
+ },
+ {
+ "epoch": 0.08895358240336407,
+ "grad_norm": 0.22754116356372833,
+ "learning_rate": 9.891111111111113e-06,
+ "loss": 0.023,
+ "step": 550
+ },
+ {
+ "epoch": 0.09299692705806242,
+ "grad_norm": 0.26447415351867676,
+ "learning_rate": 9.835555555555556e-06,
+ "loss": 0.0215,
+ "step": 575
+ },
+ {
+ "epoch": 0.09704027171276079,
+ "grad_norm": 0.27743828296661377,
+ "learning_rate": 9.780000000000001e-06,
+ "loss": 0.0215,
+ "step": 600
+ },
+ {
+ "epoch": 0.10108361636745916,
+ "grad_norm": 0.23157595098018646,
+ "learning_rate": 9.724444444444445e-06,
+ "loss": 0.023,
+ "step": 625
+ },
+ {
+ "epoch": 0.10512696102215753,
+ "grad_norm": 0.2561393082141876,
+ "learning_rate": 9.66888888888889e-06,
+ "loss": 0.0209,
+ "step": 650
+ },
+ {
+ "epoch": 0.1091703056768559,
+ "grad_norm": 0.2518314719200134,
+ "learning_rate": 9.613333333333335e-06,
+ "loss": 0.0235,
+ "step": 675
+ },
+ {
+ "epoch": 0.11321365033155426,
+ "grad_norm": 0.2346208244562149,
+ "learning_rate": 9.557777777777777e-06,
+ "loss": 0.0222,
+ "step": 700
+ },
+ {
+ "epoch": 0.11725699498625262,
+ "grad_norm": 0.23254457116127014,
+ "learning_rate": 9.502222222222223e-06,
+ "loss": 0.0216,
+ "step": 725
+ },
+ {
+ "epoch": 0.121300339640951,
+ "grad_norm": 0.2728439271450043,
+ "learning_rate": 9.446666666666667e-06,
+ "loss": 0.0208,
+ "step": 750
+ },
+ {
+ "epoch": 0.12534368429564935,
+ "grad_norm": 0.24401770532131195,
+ "learning_rate": 9.391111111111111e-06,
+ "loss": 0.0222,
+ "step": 775
+ },
+ {
+ "epoch": 0.12938702895034773,
+ "grad_norm": 0.2546014189720154,
+ "learning_rate": 9.335555555555557e-06,
+ "loss": 0.0223,
+ "step": 800
+ },
+ {
+ "epoch": 0.1334303736050461,
+ "grad_norm": 0.20592975616455078,
+ "learning_rate": 9.280000000000001e-06,
+ "loss": 0.0214,
+ "step": 825
+ },
+ {
+ "epoch": 0.13747371825974447,
+ "grad_norm": 0.2854577898979187,
+ "learning_rate": 9.224444444444445e-06,
+ "loss": 0.021,
+ "step": 850
+ },
+ {
+ "epoch": 0.14151706291444283,
+ "grad_norm": 0.2555174231529236,
+ "learning_rate": 9.168888888888889e-06,
+ "loss": 0.0216,
+ "step": 875
+ },
+ {
+ "epoch": 0.14556040756914118,
+ "grad_norm": 0.22724099457263947,
+ "learning_rate": 9.113333333333335e-06,
+ "loss": 0.0226,
+ "step": 900
+ },
+ {
+ "epoch": 0.14960375222383956,
+ "grad_norm": 0.24628663063049316,
+ "learning_rate": 9.057777777777779e-06,
+ "loss": 0.021,
+ "step": 925
+ },
+ {
+ "epoch": 0.15364709687853792,
+ "grad_norm": 0.22158333659172058,
+ "learning_rate": 9.002222222222223e-06,
+ "loss": 0.0221,
+ "step": 950
+ },
+ {
+ "epoch": 0.1576904415332363,
+ "grad_norm": 0.23985975980758667,
+ "learning_rate": 8.946666666666669e-06,
+ "loss": 0.0227,
+ "step": 975
+ },
+ {
+ "epoch": 0.16173378618793466,
+ "grad_norm": 0.24263052642345428,
+ "learning_rate": 8.891111111111111e-06,
+ "loss": 0.021,
+ "step": 1000
+ },
+ {
+ "epoch": 0.16173378618793466,
+ "eval_loss": 0.026818539947271347,
+ "eval_runtime": 15594.6052,
+ "eval_samples_per_second": 7.9,
+ "eval_steps_per_second": 0.494,
+ "eval_wer": 84.01915040370025,
+ "step": 1000
+ },
+ {
+ "epoch": 0.165777130842633,
+ "grad_norm": 0.24622474610805511,
+ "learning_rate": 8.835555555555557e-06,
+ "loss": 0.0215,
+ "step": 1025
+ },
+ {
+ "epoch": 0.1698204754973314,
+ "grad_norm": 0.218623086810112,
+ "learning_rate": 8.78e-06,
+ "loss": 0.0209,
+ "step": 1050
+ },
+ {
+ "epoch": 0.17386382015202975,
+ "grad_norm": 0.2270510345697403,
+ "learning_rate": 8.724444444444445e-06,
+ "loss": 0.0216,
+ "step": 1075
+ },
+ {
+ "epoch": 0.17790716480672814,
+ "grad_norm": 0.2207878828048706,
+ "learning_rate": 8.66888888888889e-06,
+ "loss": 0.022,
+ "step": 1100
+ },
+ {
+ "epoch": 0.1819505094614265,
+ "grad_norm": 0.18590456247329712,
+ "learning_rate": 8.613333333333333e-06,
+ "loss": 0.022,
+ "step": 1125
+ },
+ {
+ "epoch": 0.18599385411612485,
+ "grad_norm": 0.1838596612215042,
+ "learning_rate": 8.557777777777778e-06,
+ "loss": 0.0215,
+ "step": 1150
+ },
+ {
+ "epoch": 0.19003719877082323,
+ "grad_norm": 0.31669357419013977,
+ "learning_rate": 8.502222222222223e-06,
+ "loss": 0.0223,
+ "step": 1175
+ },
+ {
+ "epoch": 0.19408054342552158,
+ "grad_norm": 0.21895891427993774,
+ "learning_rate": 8.446666666666668e-06,
+ "loss": 0.0209,
+ "step": 1200
+ },
+ {
+ "epoch": 0.19812388808021997,
+ "grad_norm": 0.20323023200035095,
+ "learning_rate": 8.391111111111112e-06,
+ "loss": 0.021,
+ "step": 1225
+ },
+ {
+ "epoch": 0.20216723273491832,
+ "grad_norm": 0.25468146800994873,
+ "learning_rate": 8.335555555555556e-06,
+ "loss": 0.0224,
+ "step": 1250
+ },
+ {
+ "epoch": 0.20621057738961668,
+ "grad_norm": 0.20973573625087738,
+ "learning_rate": 8.28e-06,
+ "loss": 0.0211,
+ "step": 1275
+ },
+ {
+ "epoch": 0.21025392204431506,
+ "grad_norm": 0.26390501856803894,
+ "learning_rate": 8.224444444444444e-06,
+ "loss": 0.0208,
+ "step": 1300
+ },
+ {
+ "epoch": 0.21429726669901342,
+ "grad_norm": 0.2301158905029297,
+ "learning_rate": 8.16888888888889e-06,
+ "loss": 0.0205,
+ "step": 1325
+ },
+ {
+ "epoch": 0.2183406113537118,
+ "grad_norm": 0.24158801138401031,
+ "learning_rate": 8.113333333333334e-06,
+ "loss": 0.0206,
+ "step": 1350
+ },
+ {
+ "epoch": 0.22238395600841016,
+ "grad_norm": 0.21493862569332123,
+ "learning_rate": 8.057777777777778e-06,
+ "loss": 0.0208,
+ "step": 1375
+ },
+ {
+ "epoch": 0.2264273006631085,
+ "grad_norm": 0.2373913675546646,
+ "learning_rate": 8.002222222222222e-06,
+ "loss": 0.0198,
+ "step": 1400
+ },
+ {
+ "epoch": 0.2304706453178069,
+ "grad_norm": 0.2382635623216629,
+ "learning_rate": 7.946666666666666e-06,
+ "loss": 0.0198,
+ "step": 1425
+ },
+ {
+ "epoch": 0.23451398997250525,
+ "grad_norm": 0.256558895111084,
+ "learning_rate": 7.891111111111112e-06,
+ "loss": 0.0206,
+ "step": 1450
+ },
+ {
+ "epoch": 0.23855733462720363,
+ "grad_norm": 0.23155981302261353,
+ "learning_rate": 7.835555555555556e-06,
+ "loss": 0.0182,
+ "step": 1475
+ },
+ {
+ "epoch": 0.242600679281902,
+ "grad_norm": 0.24832689762115479,
+ "learning_rate": 7.78e-06,
+ "loss": 0.02,
+ "step": 1500
+ },
+ {
+ "epoch": 0.24664402393660034,
+ "grad_norm": 0.22871288657188416,
+ "learning_rate": 7.724444444444446e-06,
+ "loss": 0.0193,
+ "step": 1525
+ },
+ {
+ "epoch": 0.2506873685912987,
+ "grad_norm": 0.2521936297416687,
+ "learning_rate": 7.66888888888889e-06,
+ "loss": 0.0197,
+ "step": 1550
+ },
+ {
+ "epoch": 0.2547307132459971,
+ "grad_norm": 0.24429504573345184,
+ "learning_rate": 7.613333333333334e-06,
+ "loss": 0.0221,
+ "step": 1575
+ },
+ {
+ "epoch": 0.25877405790069546,
+ "grad_norm": 0.23084107041358948,
+ "learning_rate": 7.557777777777779e-06,
+ "loss": 0.02,
+ "step": 1600
+ },
+ {
+ "epoch": 0.2628174025553938,
+ "grad_norm": 0.2697436809539795,
+ "learning_rate": 7.502222222222223e-06,
+ "loss": 0.0211,
+ "step": 1625
+ },
+ {
+ "epoch": 0.2668607472100922,
+ "grad_norm": 0.23871037364006042,
+ "learning_rate": 7.446666666666668e-06,
+ "loss": 0.0214,
+ "step": 1650
+ },
+ {
+ "epoch": 0.27090409186479053,
+ "grad_norm": 0.2677021920681,
+ "learning_rate": 7.3911111111111125e-06,
+ "loss": 0.0203,
+ "step": 1675
+ },
+ {
+ "epoch": 0.27494743651948894,
+ "grad_norm": 0.1928839236497879,
+ "learning_rate": 7.335555555555556e-06,
+ "loss": 0.0203,
+ "step": 1700
+ },
+ {
+ "epoch": 0.2789907811741873,
+ "grad_norm": 0.26213887333869934,
+ "learning_rate": 7.280000000000001e-06,
+ "loss": 0.0205,
+ "step": 1725
+ },
+ {
+ "epoch": 0.28303412582888565,
+ "grad_norm": 0.23492328822612762,
+ "learning_rate": 7.224444444444445e-06,
+ "loss": 0.0201,
+ "step": 1750
+ },
+ {
+ "epoch": 0.287077470483584,
+ "grad_norm": 0.23946373164653778,
+ "learning_rate": 7.1688888888888895e-06,
+ "loss": 0.0195,
+ "step": 1775
+ },
+ {
+ "epoch": 0.29112081513828236,
+ "grad_norm": 0.20817314088344574,
+ "learning_rate": 7.113333333333334e-06,
+ "loss": 0.0211,
+ "step": 1800
+ },
+ {
+ "epoch": 0.2951641597929808,
+ "grad_norm": 0.22937080264091492,
+ "learning_rate": 7.057777777777778e-06,
+ "loss": 0.0203,
+ "step": 1825
+ },
+ {
+ "epoch": 0.29920750444767913,
+ "grad_norm": 0.23442518711090088,
+ "learning_rate": 7.0022222222222225e-06,
+ "loss": 0.0197,
+ "step": 1850
+ },
+ {
+ "epoch": 0.3032508491023775,
+ "grad_norm": 0.24289068579673767,
+ "learning_rate": 6.946666666666667e-06,
+ "loss": 0.0198,
+ "step": 1875
+ },
+ {
+ "epoch": 0.30729419375707584,
+ "grad_norm": 0.24261179566383362,
+ "learning_rate": 6.891111111111111e-06,
+ "loss": 0.0211,
+ "step": 1900
+ },
+ {
+ "epoch": 0.3113375384117742,
+ "grad_norm": 0.21574310958385468,
+ "learning_rate": 6.835555555555556e-06,
+ "loss": 0.0196,
+ "step": 1925
+ },
+ {
+ "epoch": 0.3153808830664726,
+ "grad_norm": 0.2648760676383972,
+ "learning_rate": 6.780000000000001e-06,
+ "loss": 0.021,
+ "step": 1950
+ },
+ {
+ "epoch": 0.31942422772117096,
+ "grad_norm": 0.22739122807979584,
+ "learning_rate": 6.724444444444444e-06,
+ "loss": 0.019,
+ "step": 1975
+ },
+ {
+ "epoch": 0.3234675723758693,
+ "grad_norm": 0.2167060226202011,
+ "learning_rate": 6.668888888888889e-06,
+ "loss": 0.0208,
+ "step": 2000
+ },
+ {
+ "epoch": 0.3234675723758693,
+ "eval_loss": 0.025321291759610176,
+ "eval_runtime": 14275.0619,
+ "eval_samples_per_second": 8.63,
+ "eval_steps_per_second": 0.539,
+ "eval_wer": 82.5098389256299,
+ "step": 2000
+ },
+ {
+ "epoch": 0.32751091703056767,
+ "grad_norm": 0.2261808067560196,
+ "learning_rate": 6.613333333333334e-06,
+ "loss": 0.0192,
+ "step": 2025
+ },
+ {
+ "epoch": 0.331554261685266,
+ "grad_norm": 0.26066452264785767,
+ "learning_rate": 6.557777777777778e-06,
+ "loss": 0.0222,
+ "step": 2050
+ },
+ {
+ "epoch": 0.33559760633996444,
+ "grad_norm": 0.22417497634887695,
+ "learning_rate": 6.502222222222223e-06,
+ "loss": 0.0204,
+ "step": 2075
+ },
+ {
+ "epoch": 0.3396409509946628,
+ "grad_norm": 0.20988379418849945,
+ "learning_rate": 6.446666666666668e-06,
+ "loss": 0.0191,
+ "step": 2100
+ },
+ {
+ "epoch": 0.34368429564936115,
+ "grad_norm": 0.22671236097812653,
+ "learning_rate": 6.391111111111111e-06,
+ "loss": 0.0211,
+ "step": 2125
+ },
+ {
+ "epoch": 0.3477276403040595,
+ "grad_norm": 0.2695440351963043,
+ "learning_rate": 6.335555555555556e-06,
+ "loss": 0.0197,
+ "step": 2150
+ },
+ {
+ "epoch": 0.35177098495875786,
+ "grad_norm": 0.20911026000976562,
+ "learning_rate": 6.280000000000001e-06,
+ "loss": 0.0191,
+ "step": 2175
+ },
+ {
+ "epoch": 0.35581432961345627,
+ "grad_norm": 0.21661245822906494,
+ "learning_rate": 6.224444444444445e-06,
+ "loss": 0.0199,
+ "step": 2200
+ },
+ {
+ "epoch": 0.3598576742681546,
+ "grad_norm": 0.222182035446167,
+ "learning_rate": 6.16888888888889e-06,
+ "loss": 0.0195,
+ "step": 2225
+ },
+ {
+ "epoch": 0.363901018922853,
+ "grad_norm": 0.2006169855594635,
+ "learning_rate": 6.113333333333333e-06,
+ "loss": 0.0199,
+ "step": 2250
+ },
+ {
+ "epoch": 0.36794436357755134,
+ "grad_norm": 0.22298727929592133,
+ "learning_rate": 6.057777777777778e-06,
+ "loss": 0.02,
+ "step": 2275
+ },
+ {
+ "epoch": 0.3719877082322497,
+ "grad_norm": 0.19001857936382294,
+ "learning_rate": 6.002222222222223e-06,
+ "loss": 0.0197,
+ "step": 2300
+ },
+ {
+ "epoch": 0.3760310528869481,
+ "grad_norm": 0.2434571385383606,
+ "learning_rate": 5.946666666666668e-06,
+ "loss": 0.021,
+ "step": 2325
+ },
+ {
+ "epoch": 0.38007439754164646,
+ "grad_norm": 0.21516510844230652,
+ "learning_rate": 5.891111111111112e-06,
+ "loss": 0.0193,
+ "step": 2350
+ },
+ {
+ "epoch": 0.3841177421963448,
+ "grad_norm": 0.24581994116306305,
+ "learning_rate": 5.8355555555555565e-06,
+ "loss": 0.0199,
+ "step": 2375
+ },
+ {
+ "epoch": 0.38816108685104317,
+ "grad_norm": 0.23675447702407837,
+ "learning_rate": 5.78e-06,
+ "loss": 0.0195,
+ "step": 2400
+ },
+ {
+ "epoch": 0.3922044315057415,
+ "grad_norm": 0.21948282420635223,
+ "learning_rate": 5.724444444444445e-06,
+ "loss": 0.0194,
+ "step": 2425
+ },
+ {
+ "epoch": 0.39624777616043994,
+ "grad_norm": 0.21700553596019745,
+ "learning_rate": 5.6688888888888895e-06,
+ "loss": 0.0188,
+ "step": 2450
+ },
+ {
+ "epoch": 0.4002911208151383,
+ "grad_norm": 0.24019096791744232,
+ "learning_rate": 5.613333333333334e-06,
+ "loss": 0.0211,
+ "step": 2475
+ },
+ {
+ "epoch": 0.40433446546983665,
+ "grad_norm": 0.21174991130828857,
+ "learning_rate": 5.557777777777778e-06,
+ "loss": 0.0207,
+ "step": 2500
+ },
+ {
+ "epoch": 0.408377810124535,
+ "grad_norm": 0.2548556625843048,
+ "learning_rate": 5.5022222222222224e-06,
+ "loss": 0.0202,
+ "step": 2525
+ },
+ {
+ "epoch": 0.41242115477923336,
+ "grad_norm": 0.2106471061706543,
+ "learning_rate": 5.4466666666666665e-06,
+ "loss": 0.0206,
+ "step": 2550
+ },
+ {
+ "epoch": 0.41646449943393177,
+ "grad_norm": 0.20076259970664978,
+ "learning_rate": 5.391111111111111e-06,
+ "loss": 0.0188,
+ "step": 2575
+ },
+ {
+ "epoch": 0.4205078440886301,
+ "grad_norm": 0.20281557738780975,
+ "learning_rate": 5.335555555555556e-06,
+ "loss": 0.0189,
+ "step": 2600
+ },
+ {
+ "epoch": 0.4245511887433285,
+ "grad_norm": 0.23585236072540283,
+ "learning_rate": 5.28e-06,
+ "loss": 0.0193,
+ "step": 2625
+ },
+ {
+ "epoch": 0.42859453339802683,
+ "grad_norm": 0.20076635479927063,
+ "learning_rate": 5.224444444444445e-06,
+ "loss": 0.0188,
+ "step": 2650
+ },
+ {
+ "epoch": 0.4326378780527252,
+ "grad_norm": 0.20363172888755798,
+ "learning_rate": 5.168888888888889e-06,
+ "loss": 0.0188,
+ "step": 2675
+ },
+ {
+ "epoch": 0.4366812227074236,
+ "grad_norm": 0.20968079566955566,
+ "learning_rate": 5.113333333333333e-06,
+ "loss": 0.0198,
+ "step": 2700
+ },
+ {
+ "epoch": 0.44072456736212196,
+ "grad_norm": 0.2075379192829132,
+ "learning_rate": 5.057777777777778e-06,
+ "loss": 0.0192,
+ "step": 2725
+ },
+ {
+ "epoch": 0.4447679120168203,
+ "grad_norm": 0.19762763381004333,
+ "learning_rate": 5.002222222222223e-06,
+ "loss": 0.0192,
+ "step": 2750
+ },
+ {
+ "epoch": 0.44881125667151867,
+ "grad_norm": 0.2090006172657013,
+ "learning_rate": 4.946666666666667e-06,
+ "loss": 0.0196,
+ "step": 2775
+ },
+ {
+ "epoch": 0.452854601326217,
+ "grad_norm": 0.20064634084701538,
+ "learning_rate": 4.891111111111111e-06,
+ "loss": 0.0187,
+ "step": 2800
+ },
+ {
+ "epoch": 0.45689794598091543,
+ "grad_norm": 0.20843106508255005,
+ "learning_rate": 4.835555555555556e-06,
+ "loss": 0.0192,
+ "step": 2825
+ },
+ {
+ "epoch": 0.4609412906356138,
+ "grad_norm": 0.22503884136676788,
+ "learning_rate": 4.78e-06,
+ "loss": 0.0195,
+ "step": 2850
+ },
+ {
+ "epoch": 0.46498463529031214,
+ "grad_norm": 0.22823262214660645,
+ "learning_rate": 4.724444444444445e-06,
+ "loss": 0.0199,
+ "step": 2875
+ },
+ {
+ "epoch": 0.4690279799450105,
+ "grad_norm": 0.24186566472053528,
+ "learning_rate": 4.66888888888889e-06,
+ "loss": 0.0194,
+ "step": 2900
+ },
+ {
+ "epoch": 0.47307132459970885,
+ "grad_norm": 0.27571889758110046,
+ "learning_rate": 4.613333333333334e-06,
+ "loss": 0.0194,
+ "step": 2925
+ },
+ {
+ "epoch": 0.47711466925440726,
+ "grad_norm": 0.22955967485904694,
+ "learning_rate": 4.557777777777778e-06,
+ "loss": 0.0201,
+ "step": 2950
+ },
+ {
+ "epoch": 0.4811580139091056,
+ "grad_norm": 0.21334044635295868,
+ "learning_rate": 4.502222222222223e-06,
+ "loss": 0.0204,
+ "step": 2975
+ },
+ {
+ "epoch": 0.485201358563804,
+ "grad_norm": 0.20334798097610474,
+ "learning_rate": 4.446666666666667e-06,
+ "loss": 0.0207,
+ "step": 3000
+ },
+ {
+ "epoch": 0.485201358563804,
+ "eval_loss": 0.024814918637275696,
+ "eval_runtime": 14244.4286,
+ "eval_samples_per_second": 8.649,
+ "eval_steps_per_second": 0.541,
+ "eval_wer": 81.96210492149146,
+ "step": 3000
+ },
+ {
+ "epoch": 0.48924470321850233,
+ "grad_norm": 0.23993884027004242,
+ "learning_rate": 4.391111111111112e-06,
+ "loss": 0.0192,
+ "step": 3025
+ },
+ {
+ "epoch": 0.4932880478732007,
+ "grad_norm": 0.19718846678733826,
+ "learning_rate": 4.3355555555555565e-06,
+ "loss": 0.0202,
+ "step": 3050
+ },
+ {
+ "epoch": 0.4973313925278991,
+ "grad_norm": 0.21749120950698853,
+ "learning_rate": 4.2800000000000005e-06,
+ "loss": 0.0191,
+ "step": 3075
+ },
+ {
+ "epoch": 0.5013747371825974,
+ "grad_norm": 0.22544993460178375,
+ "learning_rate": 4.2244444444444446e-06,
+ "loss": 0.0189,
+ "step": 3100
+ },
+ {
+ "epoch": 0.5054180818372959,
+ "grad_norm": 0.20528769493103027,
+ "learning_rate": 4.168888888888889e-06,
+ "loss": 0.0195,
+ "step": 3125
+ },
+ {
+ "epoch": 0.5094614264919942,
+ "grad_norm": 0.2464672178030014,
+ "learning_rate": 4.1133333333333335e-06,
+ "loss": 0.0184,
+ "step": 3150
+ },
+ {
+ "epoch": 0.5135047711466926,
+ "grad_norm": 0.22547754645347595,
+ "learning_rate": 4.057777777777778e-06,
+ "loss": 0.0202,
+ "step": 3175
+ },
+ {
+ "epoch": 0.5175481158013909,
+ "grad_norm": 0.1981554925441742,
+ "learning_rate": 4.002222222222222e-06,
+ "loss": 0.0196,
+ "step": 3200
+ },
+ {
+ "epoch": 0.5215914604560893,
+ "grad_norm": 0.23502561450004578,
+ "learning_rate": 3.946666666666667e-06,
+ "loss": 0.0188,
+ "step": 3225
+ },
+ {
+ "epoch": 0.5256348051107876,
+ "grad_norm": 0.23244182765483856,
+ "learning_rate": 3.891111111111111e-06,
+ "loss": 0.0194,
+ "step": 3250
+ },
+ {
+ "epoch": 0.529678149765486,
+ "grad_norm": 0.23334389925003052,
+ "learning_rate": 3.835555555555555e-06,
+ "loss": 0.019,
+ "step": 3275
+ },
+ {
+ "epoch": 0.5337214944201844,
+ "grad_norm": 0.24255788326263428,
+ "learning_rate": 3.7800000000000002e-06,
+ "loss": 0.019,
+ "step": 3300
+ },
+ {
+ "epoch": 0.5377648390748827,
+ "grad_norm": 0.19664430618286133,
+ "learning_rate": 3.724444444444445e-06,
+ "loss": 0.0186,
+ "step": 3325
+ },
+ {
+ "epoch": 0.5418081837295811,
+ "grad_norm": 0.23343808948993683,
+ "learning_rate": 3.668888888888889e-06,
+ "loss": 0.0185,
+ "step": 3350
+ },
+ {
+ "epoch": 0.5458515283842795,
+ "grad_norm": 0.22261999547481537,
+ "learning_rate": 3.6133333333333336e-06,
+ "loss": 0.0185,
+ "step": 3375
+ },
+ {
+ "epoch": 0.5498948730389779,
+ "grad_norm": 0.26387473940849304,
+ "learning_rate": 3.5577777777777785e-06,
+ "loss": 0.0191,
+ "step": 3400
+ },
+ {
+ "epoch": 0.5539382176936762,
+ "grad_norm": 0.2524121105670929,
+ "learning_rate": 3.5022222222222225e-06,
+ "loss": 0.019,
+ "step": 3425
+ },
+ {
+ "epoch": 0.5579815623483746,
+ "grad_norm": 0.21356618404388428,
+ "learning_rate": 3.446666666666667e-06,
+ "loss": 0.0202,
+ "step": 3450
+ },
+ {
+ "epoch": 0.562024907003073,
+ "grad_norm": 0.20150095224380493,
+ "learning_rate": 3.391111111111111e-06,
+ "loss": 0.0199,
+ "step": 3475
+ },
+ {
+ "epoch": 0.5660682516577713,
+ "grad_norm": 0.20904186367988586,
+ "learning_rate": 3.335555555555556e-06,
+ "loss": 0.0187,
+ "step": 3500
+ },
+ {
+ "epoch": 0.5701115963124697,
+ "grad_norm": 0.2027483582496643,
+ "learning_rate": 3.2800000000000004e-06,
+ "loss": 0.0184,
+ "step": 3525
+ },
+ {
+ "epoch": 0.574154940967168,
+ "grad_norm": 0.23289534449577332,
+ "learning_rate": 3.2244444444444444e-06,
+ "loss": 0.0196,
+ "step": 3550
+ },
+ {
+ "epoch": 0.5781982856218664,
+ "grad_norm": 0.23588407039642334,
+ "learning_rate": 3.1688888888888893e-06,
+ "loss": 0.0179,
+ "step": 3575
+ },
+ {
+ "epoch": 0.5822416302765647,
+ "grad_norm": 0.2624644935131073,
+ "learning_rate": 3.1133333333333337e-06,
+ "loss": 0.0186,
+ "step": 3600
+ },
+ {
+ "epoch": 0.5862849749312632,
+ "grad_norm": 0.21287862956523895,
+ "learning_rate": 3.0577777777777778e-06,
+ "loss": 0.0192,
+ "step": 3625
+ },
+ {
+ "epoch": 0.5903283195859615,
+ "grad_norm": 0.24328123033046722,
+ "learning_rate": 3.0022222222222227e-06,
+ "loss": 0.0191,
+ "step": 3650
+ },
+ {
+ "epoch": 0.5943716642406599,
+ "grad_norm": 0.1968221366405487,
+ "learning_rate": 2.946666666666667e-06,
+ "loss": 0.0189,
+ "step": 3675
+ },
+ {
+ "epoch": 0.5984150088953583,
+ "grad_norm": 0.22048558294773102,
+ "learning_rate": 2.891111111111111e-06,
+ "loss": 0.0193,
+ "step": 3700
+ },
+ {
+ "epoch": 0.6024583535500566,
+ "grad_norm": 0.23322191834449768,
+ "learning_rate": 2.835555555555556e-06,
+ "loss": 0.0203,
+ "step": 3725
+ },
+ {
+ "epoch": 0.606501698204755,
+ "grad_norm": 0.2503873407840729,
+ "learning_rate": 2.7800000000000005e-06,
+ "loss": 0.0187,
+ "step": 3750
+ },
+ {
+ "epoch": 0.6105450428594533,
+ "grad_norm": 0.23583608865737915,
+ "learning_rate": 2.7244444444444445e-06,
+ "loss": 0.0182,
+ "step": 3775
+ },
+ {
+ "epoch": 0.6145883875141517,
+ "grad_norm": 0.21786954998970032,
+ "learning_rate": 2.6688888888888894e-06,
+ "loss": 0.0187,
+ "step": 3800
+ },
+ {
+ "epoch": 0.61863173216885,
+ "grad_norm": 0.21708305180072784,
+ "learning_rate": 2.6133333333333334e-06,
+ "loss": 0.0207,
+ "step": 3825
+ },
+ {
+ "epoch": 0.6226750768235484,
+ "grad_norm": 0.21468117833137512,
+ "learning_rate": 2.557777777777778e-06,
+ "loss": 0.0206,
+ "step": 3850
+ },
+ {
+ "epoch": 0.6267184214782469,
+ "grad_norm": 0.2351790964603424,
+ "learning_rate": 2.5022222222222224e-06,
+ "loss": 0.0184,
+ "step": 3875
+ },
+ {
+ "epoch": 0.6307617661329452,
+ "grad_norm": 0.20362141728401184,
+ "learning_rate": 2.446666666666667e-06,
+ "loss": 0.0181,
+ "step": 3900
+ },
+ {
+ "epoch": 0.6348051107876436,
+ "grad_norm": 0.22791029512882233,
+ "learning_rate": 2.3911111111111113e-06,
+ "loss": 0.0176,
+ "step": 3925
+ },
+ {
+ "epoch": 0.6388484554423419,
+ "grad_norm": 0.23184864223003387,
+ "learning_rate": 2.3355555555555557e-06,
+ "loss": 0.0201,
+ "step": 3950
+ },
+ {
+ "epoch": 0.6428918000970403,
+ "grad_norm": 0.213025763630867,
+ "learning_rate": 2.28e-06,
+ "loss": 0.0178,
+ "step": 3975
+ },
+ {
+ "epoch": 0.6469351447517386,
+ "grad_norm": 0.2062954306602478,
+ "learning_rate": 2.2244444444444447e-06,
+ "loss": 0.0198,
+ "step": 4000
+ },
+ {
+ "epoch": 0.6469351447517386,
+ "eval_loss": 0.02430112473666668,
+ "eval_runtime": 14235.4068,
+ "eval_samples_per_second": 8.654,
+ "eval_steps_per_second": 0.541,
+ "eval_wer": 81.67647178155556,
+ "step": 4000
+ }
+ ],
+ "logging_steps": 25,
+ "max_steps": 5000,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 1,
+ "save_steps": 1000,
+ "stateful_callbacks": {
+ "TrainerControl": {
+ "args": {
+ "should_epoch_stop": false,
+ "should_evaluate": false,
+ "should_log": false,
+ "should_save": true,
+ "should_training_stop": false
+ },
+ "attributes": {}
+ }
+ },
+ "total_flos": 3.693893124096e+19,
+ "train_batch_size": 32,
+ "trial_name": null,
+ "trial_params": null
+ }
checkpoint-4000/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f9f7caca91e9e74c1f1f32c6f743e7a01e94bbb4a987606d363be4bc023f04fb
+ size 5496
checkpoint-5000/config.json ADDED
@@ -0,0 +1,60 @@
+ {
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": null,
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 768,
+ "decoder_attention_heads": 12,
+ "decoder_ffn_dim": 3072,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 12,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 12,
+ "encoder_ffn_dim": 3072,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 12,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [
+ [
+ 1,
+ 50259
+ ],
+ [
+ 2,
+ 50359
+ ],
+ [
+ 3,
+ 50363
+ ]
+ ],
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": null,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 12,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "torch_dtype": "float32",
+ "transformers_version": "4.51.3",
+ "use_cache": true,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
checkpoint-5000/generation_config.json ADDED
@@ -0,0 +1,254 @@
+ {
+ "alignment_heads": [
+ [
+ 5,
+ 3
+ ],
+ [
+ 5,
+ 9
+ ],
+ [
+ 8,
+ 0
+ ],
+ [
+ 8,
+ 4
+ ],
+ [
+ 8,
+ 7
+ ],
+ [
+ 8,
+ 8
+ ],
+ [
+ 9,
+ 0
+ ],
+ [
+ 9,
+ 7
+ ],
+ [
+ 9,
+ 9
+ ],
+ [
+ 10,
+ 5
+ ]
+ ],
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327,
+ "<|am|>": 50334,
+ "<|ar|>": 50272,
+ "<|as|>": 50350,
+ "<|az|>": 50304,
+ "<|ba|>": 50355,
+ "<|be|>": 50330,
+ "<|bg|>": 50292,
+ "<|bn|>": 50302,
+ "<|bo|>": 50347,
+ "<|br|>": 50309,
+ "<|bs|>": 50315,
+ "<|ca|>": 50270,
+ "<|cs|>": 50283,
+ "<|cy|>": 50297,
+ "<|da|>": 50285,
+ "<|de|>": 50261,
+ "<|el|>": 50281,
+ "<|en|>": 50259,
+ "<|es|>": 50262,
+ "<|et|>": 50307,
+ "<|eu|>": 50310,
+ "<|fa|>": 50300,
+ "<|fi|>": 50277,
+ "<|fo|>": 50338,
+ "<|fr|>": 50265,
+ "<|gl|>": 50319,
+ "<|gu|>": 50333,
+ "<|haw|>": 50352,
+ "<|ha|>": 50354,
+ "<|he|>": 50279,
+ "<|hi|>": 50276,
+ "<|hr|>": 50291,
+ "<|ht|>": 50339,
+ "<|hu|>": 50286,
+ "<|hy|>": 50312,
+ "<|id|>": 50275,
+ "<|is|>": 50311,
+ "<|it|>": 50274,
+ "<|ja|>": 50266,
+ "<|jw|>": 50356,
+ "<|ka|>": 50329,
+ "<|kk|>": 50316,
+ "<|km|>": 50323,
+ "<|kn|>": 50306,
+ "<|ko|>": 50264,
+ "<|la|>": 50294,
+ "<|lb|>": 50345,
+ "<|ln|>": 50353,
+ "<|lo|>": 50336,
+ "<|lt|>": 50293,
+ "<|lv|>": 50301,
+ "<|mg|>": 50349,
+ "<|mi|>": 50295,
+ "<|mk|>": 50308,
+ "<|ml|>": 50296,
+ "<|mn|>": 50314,
+ "<|mr|>": 50320,
+ "<|ms|>": 50282,
+ "<|mt|>": 50343,
+ "<|my|>": 50346,
+ "<|ne|>": 50313,
+ "<|nl|>": 50271,
+ "<|nn|>": 50342,
+ "<|no|>": 50288,
+ "<|oc|>": 50328,
+ "<|pa|>": 50321,
+ "<|pl|>": 50269,
+ "<|ps|>": 50340,
+ "<|pt|>": 50267,
+ "<|ro|>": 50284,
+ "<|ru|>": 50263,
+ "<|sa|>": 50344,
+ "<|sd|>": 50332,
+ "<|si|>": 50322,
+ "<|sk|>": 50298,
+ "<|sl|>": 50305,
+ "<|sn|>": 50324,
+ "<|so|>": 50326,
+ "<|sq|>": 50317,
+ "<|sr|>": 50303,
+ "<|su|>": 50357,
+ "<|sv|>": 50273,
+ "<|sw|>": 50318,
+ "<|ta|>": 50287,
+ "<|te|>": 50299,
+ "<|tg|>": 50331,
+ "<|th|>": 50289,
+ "<|tk|>": 50341,
+ "<|tl|>": 50348,
+ "<|tr|>": 50268,
+ "<|tt|>": 50351,
+ "<|uk|>": 50280,
+ "<|ur|>": 50290,
+ "<|uz|>": 50337,
+ "<|vi|>": 50278,
+ "<|yi|>": 50335,
+ "<|yo|>": 50325,
+ "<|zh|>": 50260
+ },
+ "language": "chinese",
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1,
+ 2,
+ 7,
+ 8,
+ 9,
+ 10,
+ 14,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 31,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 90,
+ 91,
+ 92,
+ 93,
+ 359,
+ 503,
+ 522,
+ 542,
+ 873,
+ 893,
+ 902,
+ 918,
+ 922,
+ 931,
+ 1350,
+ 1853,
+ 1982,
+ 2460,
+ 2627,
+ 3246,
+ 3253,
+ 3268,
+ 3536,
+ 3846,
+ 3961,
+ 4183,
+ 4667,
+ 6585,
+ 6647,
+ 7273,
+ 9061,
+ 9383,
+ 10428,
+ 10929,
+ 11938,
+ 12033,
+ 12331,
+ 12562,
+ 13793,
+ 14157,
+ 14635,
+ 15265,
+ 15618,
+ 16553,
+ 16604,
+ 18362,
+ 18956,
+ 20075,
+ 21675,
+ 22520,
+ 26130,
+ 26161,
+ 26435,
+ 28279,
+ 29464,
+ 31650,
+ 32302,
+ 32470,
+ 36865,
+ 42863,
+ 47425,
+ 49870,
+ 50254,
+ 50258,
+ 50360,
+ 50361,
+ 50362
+ ],
+ "task": "transcribe",
+ "task_to_id": {
+ "transcribe": 50359,
+ "translate": 50358
+ },
+ "transformers_version": "4.51.3"
+ }
checkpoint-5000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2a55144040a6e7b23d09e36e30316ad1cd153f71fb213853d9e0f9baf0025f96
+ size 966995080
checkpoint-5000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5b53d6b15e9a25367dd1d6e6bd95b501aa1fba1d01d4ee9458f64b41a591b0cd
+ size 1925064044
checkpoint-5000/preprocessor_config.json ADDED
@@ -0,0 +1,15 @@
+ {
+ "chunk_length": 30,
+ "dither": 0.0,
+ "feature_extractor_type": "WhisperFeatureExtractor",
+ "feature_size": 80,
+ "hop_length": 160,
+ "n_fft": 400,
+ "n_samples": 480000,
+ "nb_max_frames": 3000,
+ "padding_side": "right",
+ "padding_value": 0.0,
+ "processor_class": "WhisperProcessor",
+ "return_attention_mask": false,
+ "sampling_rate": 16000
+ }
checkpoint-5000/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c14630f44585d74464a51bc46b7600a9dcd16f30ed1f5cb7e67927cf66412e42
+ size 14244
checkpoint-5000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3d6123b92a1a964482602c80eafadd952483e879e0467db7e659b5938ddcc1ab
+ size 1064
checkpoint-5000/trainer_state.json ADDED
@@ -0,0 +1,1479 @@
+ {
+ "best_global_step": 5000,
+ "best_metric": 81.25857102284255,
+ "best_model_checkpoint": "./working_area/output_model/checkpoint-5000",
+ "epoch": 0.8086689309396733,
+ "eval_steps": 1000,
+ "global_step": 5000,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.004043344654698367,
+ "grad_norm": 10.67656421661377,
+ "learning_rate": 4.800000000000001e-07,
+ "loss": 0.2728,
+ "step": 25
+ },
+ {
+ "epoch": 0.008086689309396733,
+ "grad_norm": 0.34478363394737244,
+ "learning_rate": 9.800000000000001e-07,
+ "loss": 0.0358,
+ "step": 50
+ },
+ {
+ "epoch": 0.012130033964095099,
+ "grad_norm": 0.26953190565109253,
+ "learning_rate": 1.48e-06,
+ "loss": 0.0261,
+ "step": 75
+ },
+ {
+ "epoch": 0.016173378618793467,
+ "grad_norm": 0.2419331669807434,
+ "learning_rate": 1.98e-06,
+ "loss": 0.0251,
+ "step": 100
+ },
+ {
+ "epoch": 0.020216723273491832,
+ "grad_norm": 0.29993513226509094,
+ "learning_rate": 2.4800000000000004e-06,
+ "loss": 0.0247,
+ "step": 125
+ },
+ {
+ "epoch": 0.024260067928190198,
+ "grad_norm": 0.2362569123506546,
+ "learning_rate": 2.9800000000000003e-06,
+ "loss": 0.0227,
+ "step": 150
+ },
+ {
+ "epoch": 0.028303412582888564,
+ "grad_norm": 0.2520463466644287,
+ "learning_rate": 3.48e-06,
+ "loss": 0.0246,
+ "step": 175
+ },
+ {
+ "epoch": 0.03234675723758693,
+ "grad_norm": 0.2729567885398865,
+ "learning_rate": 3.980000000000001e-06,
+ "loss": 0.0236,
+ "step": 200
+ },
+ {
+ "epoch": 0.036390101892285295,
+ "grad_norm": 0.21665456891059875,
+ "learning_rate": 4.48e-06,
+ "loss": 0.0242,
+ "step": 225
+ },
+ {
+ "epoch": 0.040433446546983665,
+ "grad_norm": 0.27291786670684814,
+ "learning_rate": 4.980000000000001e-06,
+ "loss": 0.0242,
+ "step": 250
+ },
+ {
+ "epoch": 0.044476791201682034,
+ "grad_norm": 0.20698551833629608,
+ "learning_rate": 5.480000000000001e-06,
+ "loss": 0.0229,
+ "step": 275
+ },
+ {
+ "epoch": 0.048520135856380396,
+ "grad_norm": 0.2506471276283264,
+ "learning_rate": 5.98e-06,
+ "loss": 0.0225,
+ "step": 300
+ },
+ {
+ "epoch": 0.052563480511078765,
+ "grad_norm": 0.24540211260318756,
+ "learning_rate": 6.480000000000001e-06,
+ "loss": 0.0231,
+ "step": 325
+ },
+ {
+ "epoch": 0.05660682516577713,
+ "grad_norm": 0.2298947423696518,
+ "learning_rate": 6.98e-06,
+ "loss": 0.0223,
+ "step": 350
+ },
+ {
+ "epoch": 0.0606501698204755,
+ "grad_norm": 0.19595475494861603,
+ "learning_rate": 7.48e-06,
+ "loss": 0.022,
+ "step": 375
+ },
+ {
+ "epoch": 0.06469351447517387,
+ "grad_norm": 0.2605821490287781,
+ "learning_rate": 7.980000000000002e-06,
+ "loss": 0.022,
+ "step": 400
+ },
+ {
+ "epoch": 0.06873685912987224,
+ "grad_norm": 0.25346869230270386,
+ "learning_rate": 8.48e-06,
+ "loss": 0.0226,
+ "step": 425
+ },
+ {
+ "epoch": 0.07278020378457059,
+ "grad_norm": 0.255500465631485,
+ "learning_rate": 8.98e-06,
+ "loss": 0.0216,
+ "step": 450
+ },
+ {
+ "epoch": 0.07682354843926896,
+ "grad_norm": 0.24462725222110748,
+ "learning_rate": 9.48e-06,
+ "loss": 0.0234,
+ "step": 475
+ },
+ {
+ "epoch": 0.08086689309396733,
+ "grad_norm": 0.23454348742961884,
+ "learning_rate": 9.980000000000001e-06,
+ "loss": 0.024,
+ "step": 500
+ },
+ {
+ "epoch": 0.0849102377486657,
+ "grad_norm": 0.2627425193786621,
+ "learning_rate": 9.946666666666667e-06,
+ "loss": 0.0226,
+ "step": 525
+ },
+ {
+ "epoch": 0.08895358240336407,
+ "grad_norm": 0.22754116356372833,
+ "learning_rate": 9.891111111111113e-06,
+ "loss": 0.023,
+ "step": 550
+ },
+ {
+ "epoch": 0.09299692705806242,
+ "grad_norm": 0.26447415351867676,
+ "learning_rate": 9.835555555555556e-06,
+ "loss": 0.0215,
+ "step": 575
+ },
+ {
+ "epoch": 0.09704027171276079,
+ "grad_norm": 0.27743828296661377,
+ "learning_rate": 9.780000000000001e-06,
+ "loss": 0.0215,
+ "step": 600
+ },
+ {
+ "epoch": 0.10108361636745916,
+ "grad_norm": 0.23157595098018646,
+ "learning_rate": 9.724444444444445e-06,
+ "loss": 0.023,
+ "step": 625
+ },
+ {
+ "epoch": 0.10512696102215753,
+ "grad_norm": 0.2561393082141876,
+ "learning_rate": 9.66888888888889e-06,
+ "loss": 0.0209,
+ "step": 650
+ },
+ {
+ "epoch": 0.1091703056768559,
+ "grad_norm": 0.2518314719200134,
+ "learning_rate": 9.613333333333335e-06,
+ "loss": 0.0235,
+ "step": 675
+ },
+ {
+ "epoch": 0.11321365033155426,
+ "grad_norm": 0.2346208244562149,
+ "learning_rate": 9.557777777777777e-06,
+ "loss": 0.0222,
+ "step": 700
+ },
+ {
+ "epoch": 0.11725699498625262,
+ "grad_norm": 0.23254457116127014,
+ "learning_rate": 9.502222222222223e-06,
+ "loss": 0.0216,
+ "step": 725
+ },
+ {
+ "epoch": 0.121300339640951,
+ "grad_norm": 0.2728439271450043,
+ "learning_rate": 9.446666666666667e-06,
+ "loss": 0.0208,
+ "step": 750
+ },
+ {
+ "epoch": 0.12534368429564935,
+ "grad_norm": 0.24401770532131195,
+ "learning_rate": 9.391111111111111e-06,
+ "loss": 0.0222,
+ "step": 775
+ },
+ {
+ "epoch": 0.12938702895034773,
+ "grad_norm": 0.2546014189720154,
+ "learning_rate": 9.335555555555557e-06,
+ "loss": 0.0223,
+ "step": 800
+ },
+ {
+ "epoch": 0.1334303736050461,
+ "grad_norm": 0.20592975616455078,
+ "learning_rate": 9.280000000000001e-06,
+ "loss": 0.0214,
+ "step": 825
+ },
+ {
+ "epoch": 0.13747371825974447,
+ "grad_norm": 0.2854577898979187,
+ "learning_rate": 9.224444444444445e-06,
+ "loss": 0.021,
+ "step": 850
+ },
+ {
+ "epoch": 0.14151706291444283,
+ "grad_norm": 0.2555174231529236,
+ "learning_rate": 9.168888888888889e-06,
+ "loss": 0.0216,
+ "step": 875
+ },
+ {
+ "epoch": 0.14556040756914118,
+ "grad_norm": 0.22724099457263947,
+ "learning_rate": 9.113333333333335e-06,
+ "loss": 0.0226,
+ "step": 900
+ },
+ {
+ "epoch": 0.14960375222383956,
+ "grad_norm": 0.24628663063049316,
+ "learning_rate": 9.057777777777779e-06,
+ "loss": 0.021,
+ "step": 925
+ },
+ {
+ "epoch": 0.15364709687853792,
+ "grad_norm": 0.22158333659172058,
+ "learning_rate": 9.002222222222223e-06,
+ "loss": 0.0221,
+ "step": 950
+ },
+ {
+ "epoch": 0.1576904415332363,
+ "grad_norm": 0.23985975980758667,
+ "learning_rate": 8.946666666666669e-06,
+ "loss": 0.0227,
+ "step": 975
+ },
+ {
+ "epoch": 0.16173378618793466,
+ "grad_norm": 0.24263052642345428,
+ "learning_rate": 8.891111111111111e-06,
+ "loss": 0.021,
+ "step": 1000
+ },
+ {
+ "epoch": 0.16173378618793466,
+ "eval_loss": 0.026818539947271347,
+ "eval_runtime": 15594.6052,
+ "eval_samples_per_second": 7.9,
+ "eval_steps_per_second": 0.494,
+ "eval_wer": 84.01915040370025,
+ "step": 1000
+ },
+ {
+ "epoch": 0.165777130842633,
+ "grad_norm": 0.24622474610805511,
+ "learning_rate": 8.835555555555557e-06,
+ "loss": 0.0215,
+ "step": 1025
+ },
+ {
+ "epoch": 0.1698204754973314,
+ "grad_norm": 0.218623086810112,
+ "learning_rate": 8.78e-06,
+ "loss": 0.0209,
+ "step": 1050
+ },
+ {
+ "epoch": 0.17386382015202975,
+ "grad_norm": 0.2270510345697403,
+ "learning_rate": 8.724444444444445e-06,
+ "loss": 0.0216,
+ "step": 1075
+ },
+ {
+ "epoch": 0.17790716480672814,
+ "grad_norm": 0.2207878828048706,
+ "learning_rate": 8.66888888888889e-06,
+ "loss": 0.022,
+ "step": 1100
+ },
+ {
+ "epoch": 0.1819505094614265,
+ "grad_norm": 0.18590456247329712,
+ "learning_rate": 8.613333333333333e-06,
+ "loss": 0.022,
+ "step": 1125
+ },
+ {
+ "epoch": 0.18599385411612485,
+ "grad_norm": 0.1838596612215042,
+ "learning_rate": 8.557777777777778e-06,
+ "loss": 0.0215,
+ "step": 1150
+ },
+ {
+ "epoch": 0.19003719877082323,
+ "grad_norm": 0.31669357419013977,
+ "learning_rate": 8.502222222222223e-06,
+ "loss": 0.0223,
+ "step": 1175
+ },
+ {
+ "epoch": 0.19408054342552158,
+ "grad_norm": 0.21895891427993774,
+ "learning_rate": 8.446666666666668e-06,
+ "loss": 0.0209,
+ "step": 1200
+ },
+ {
+ "epoch": 0.19812388808021997,
+ "grad_norm": 0.20323023200035095,
+ "learning_rate": 8.391111111111112e-06,
+ "loss": 0.021,
+ "step": 1225
+ },
+ {
+ "epoch": 0.20216723273491832,
+ "grad_norm": 0.25468146800994873,
+ "learning_rate": 8.335555555555556e-06,
+ "loss": 0.0224,
+ "step": 1250
+ },
+ {
+ "epoch": 0.20621057738961668,
+ "grad_norm": 0.20973573625087738,
+ "learning_rate": 8.28e-06,
+ "loss": 0.0211,
+ "step": 1275
+ },
+ {
+ "epoch": 0.21025392204431506,
+ "grad_norm": 0.26390501856803894,
+ "learning_rate": 8.224444444444444e-06,
+ "loss": 0.0208,
+ "step": 1300
+ },
+ {
+ "epoch": 0.21429726669901342,
+ "grad_norm": 0.2301158905029297,
+ "learning_rate": 8.16888888888889e-06,
+ "loss": 0.0205,
+ "step": 1325
+ },
+ {
+ "epoch": 0.2183406113537118,
+ "grad_norm": 0.24158801138401031,
+ "learning_rate": 8.113333333333334e-06,
+ "loss": 0.0206,
+ "step": 1350
+ },
+ {
+ "epoch": 0.22238395600841016,
+ "grad_norm": 0.21493862569332123,
+ "learning_rate": 8.057777777777778e-06,
+ "loss": 0.0208,
+ "step": 1375
+ },
+ {
+ "epoch": 0.2264273006631085,
+ "grad_norm": 0.2373913675546646,
+ "learning_rate": 8.002222222222222e-06,
+ "loss": 0.0198,
+ "step": 1400
+ },
+ {
+ "epoch": 0.2304706453178069,
+ "grad_norm": 0.2382635623216629,
+ "learning_rate": 7.946666666666666e-06,
+ "loss": 0.0198,
+ "step": 1425
+ },
+ {
+ "epoch": 0.23451398997250525,
+ "grad_norm": 0.256558895111084,
+ "learning_rate": 7.891111111111112e-06,
+ "loss": 0.0206,
+ "step": 1450
+ },
+ {
+ "epoch": 0.23855733462720363,
+ "grad_norm": 0.23155981302261353,
+ "learning_rate": 7.835555555555556e-06,
+ "loss": 0.0182,
+ "step": 1475
+ },
+ {
+ "epoch": 0.242600679281902,
+ "grad_norm": 0.24832689762115479,
+ "learning_rate": 7.78e-06,
+ "loss": 0.02,
+ "step": 1500
+ },
+ {
+ "epoch": 0.24664402393660034,
+ "grad_norm": 0.22871288657188416,
+ "learning_rate": 7.724444444444446e-06,
+ "loss": 0.0193,
+ "step": 1525
+ },
+ {
+ "epoch": 0.2506873685912987,
+ "grad_norm": 0.2521936297416687,
+ "learning_rate": 7.66888888888889e-06,
+ "loss": 0.0197,
+ "step": 1550
+ },
+ {
+ "epoch": 0.2547307132459971,
+ "grad_norm": 0.24429504573345184,
+ "learning_rate": 7.613333333333334e-06,
+ "loss": 0.0221,
+ "step": 1575
+ },
+ {
+ "epoch": 0.25877405790069546,
+ "grad_norm": 0.23084107041358948,
+ "learning_rate": 7.557777777777779e-06,
+ "loss": 0.02,
+ "step": 1600
+ },
+ {
+ "epoch": 0.2628174025553938,
+ "grad_norm": 0.2697436809539795,
+ "learning_rate": 7.502222222222223e-06,
+ "loss": 0.0211,
+ "step": 1625
+ },
+ {
+ "epoch": 0.2668607472100922,
+ "grad_norm": 0.23871037364006042,
+ "learning_rate": 7.446666666666668e-06,
+ "loss": 0.0214,
+ "step": 1650
+ },
+ {
+ "epoch": 0.27090409186479053,
+ "grad_norm": 0.2677021920681,
+ "learning_rate": 7.3911111111111125e-06,
+ "loss": 0.0203,
+ "step": 1675
+ },
+ {
+ "epoch": 0.27494743651948894,
+ "grad_norm": 0.1928839236497879,
+ "learning_rate": 7.335555555555556e-06,
+ "loss": 0.0203,
+ "step": 1700
+ },
+ {
+ "epoch": 0.2789907811741873,
+ "grad_norm": 0.26213887333869934,
+ "learning_rate": 7.280000000000001e-06,
+ "loss": 0.0205,
+ "step": 1725
+ },
+ {
+ "epoch": 0.28303412582888565,
+ "grad_norm": 0.23492328822612762,
+ "learning_rate": 7.224444444444445e-06,
+ "loss": 0.0201,
+ "step": 1750
+ },
+ {
+ "epoch": 0.287077470483584,
+ "grad_norm": 0.23946373164653778,
+ "learning_rate": 7.1688888888888895e-06,
+ "loss": 0.0195,
+ "step": 1775
+ },
+ {
+ "epoch": 0.29112081513828236,
+ "grad_norm": 0.20817314088344574,
+ "learning_rate": 7.113333333333334e-06,
+ "loss": 0.0211,
+ "step": 1800
+ },
+ {
+ "epoch": 0.2951641597929808,
+ "grad_norm": 0.22937080264091492,
+ "learning_rate": 7.057777777777778e-06,
+ "loss": 0.0203,
+ "step": 1825
+ },
+ {
+ "epoch": 0.29920750444767913,
+ "grad_norm": 0.23442518711090088,
+ "learning_rate": 7.0022222222222225e-06,
+ "loss": 0.0197,
+ "step": 1850
+ },
+ {
+ "epoch": 0.3032508491023775,
+ "grad_norm": 0.24289068579673767,
+ "learning_rate": 6.946666666666667e-06,
+ "loss": 0.0198,
+ "step": 1875
+ },
+ {
+ "epoch": 0.30729419375707584,
+ "grad_norm": 0.24261179566383362,
+ "learning_rate": 6.891111111111111e-06,
+ "loss": 0.0211,
+ "step": 1900
+ },
+ {
+ "epoch": 0.3113375384117742,
+ "grad_norm": 0.21574310958385468,
+ "learning_rate": 6.835555555555556e-06,
+ "loss": 0.0196,
+ "step": 1925
+ },
+ {
+ "epoch": 0.3153808830664726,
+ "grad_norm": 0.2648760676383972,
+ "learning_rate": 6.780000000000001e-06,
+ "loss": 0.021,
+ "step": 1950
+ },
+ {
+ "epoch": 0.31942422772117096,
+ "grad_norm": 0.22739122807979584,
+ "learning_rate": 6.724444444444444e-06,
+ "loss": 0.019,
+ "step": 1975
+ },
+ {
+ "epoch": 0.3234675723758693,
+ "grad_norm": 0.2167060226202011,
+ "learning_rate": 6.668888888888889e-06,
+ "loss": 0.0208,
+ "step": 2000
+ },
+ {
+ "epoch": 0.3234675723758693,
+ "eval_loss": 0.025321291759610176,
+ "eval_runtime": 14275.0619,
+ "eval_samples_per_second": 8.63,
+ "eval_steps_per_second": 0.539,
+ "eval_wer": 82.5098389256299,
+ "step": 2000
+ },
+ {
+ "epoch": 0.32751091703056767,
+ "grad_norm": 0.2261808067560196,
+ "learning_rate": 6.613333333333334e-06,
+ "loss": 0.0192,
+ "step": 2025
+ },
+ {
+ "epoch": 0.331554261685266,
+ "grad_norm": 0.26066452264785767,
+ "learning_rate": 6.557777777777778e-06,
+ "loss": 0.0222,
+ "step": 2050
+ },
+ {
+ "epoch": 0.33559760633996444,
+ "grad_norm": 0.22417497634887695,
+ "learning_rate": 6.502222222222223e-06,
+ "loss": 0.0204,
+ "step": 2075
+ },
+ {
+ "epoch": 0.3396409509946628,
+ "grad_norm": 0.20988379418849945,
+ "learning_rate": 6.446666666666668e-06,
+ "loss": 0.0191,
+ "step": 2100
+ },
+ {
+ "epoch": 0.34368429564936115,
+ "grad_norm": 0.22671236097812653,
+ "learning_rate": 6.391111111111111e-06,
+ "loss": 0.0211,
+ "step": 2125
+ },
+ {
+ "epoch": 0.3477276403040595,
+ "grad_norm": 0.2695440351963043,
+ "learning_rate": 6.335555555555556e-06,
+ "loss": 0.0197,
+ "step": 2150
+ },
+ {
+ "epoch": 0.35177098495875786,
+ "grad_norm": 0.20911026000976562,
+ "learning_rate": 6.280000000000001e-06,
+ "loss": 0.0191,
+ "step": 2175
+ },
+ {
+ "epoch": 0.35581432961345627,
+ "grad_norm": 0.21661245822906494,
+ "learning_rate": 6.224444444444445e-06,
+ "loss": 0.0199,
+ "step": 2200
+ },
+ {
+ "epoch": 0.3598576742681546,
+ "grad_norm": 0.222182035446167,
+ "learning_rate": 6.16888888888889e-06,
+ "loss": 0.0195,
+ "step": 2225
+ },
+ {
+ "epoch": 0.363901018922853,
+ "grad_norm": 0.2006169855594635,
+ "learning_rate": 6.113333333333333e-06,
+ "loss": 0.0199,
+ "step": 2250
+ },
+ {
+ "epoch": 0.36794436357755134,
+ "grad_norm": 0.22298727929592133,
+ "learning_rate": 6.057777777777778e-06,
+ "loss": 0.02,
+ "step": 2275
+ },
+ {
+ "epoch": 0.3719877082322497,
+ "grad_norm": 0.19001857936382294,
+ "learning_rate": 6.002222222222223e-06,
+ "loss": 0.0197,
+ "step": 2300
+ },
+ {
+ "epoch": 0.3760310528869481,
+ "grad_norm": 0.2434571385383606,
+ "learning_rate": 5.946666666666668e-06,
+ "loss": 0.021,
+ "step": 2325
+ },
+ {
+ "epoch": 0.38007439754164646,
+ "grad_norm": 0.21516510844230652,
+ "learning_rate": 5.891111111111112e-06,
+ "loss": 0.0193,
+ "step": 2350
+ },
+ {
+ "epoch": 0.3841177421963448,
+ "grad_norm": 0.24581994116306305,
+ "learning_rate": 5.8355555555555565e-06,
+ "loss": 0.0199,
+ "step": 2375
+ },
+ {
+ "epoch": 0.38816108685104317,
+ "grad_norm": 0.23675447702407837,
+ "learning_rate": 5.78e-06,
+ "loss": 0.0195,
+ "step": 2400
+ },
+ {
+ "epoch": 0.3922044315057415,
+ "grad_norm": 0.21948282420635223,
+ "learning_rate": 5.724444444444445e-06,
+ "loss": 0.0194,
+ "step": 2425
+ },
+ {
+ "epoch": 0.39624777616043994,
+ "grad_norm": 0.21700553596019745,
+ "learning_rate": 5.6688888888888895e-06,
+ "loss": 0.0188,
+ "step": 2450
+ },
+ {
+ "epoch": 0.4002911208151383,
+ "grad_norm": 0.24019096791744232,
+ "learning_rate": 5.613333333333334e-06,
+ "loss": 0.0211,
+ "step": 2475
+ },
+ {
+ "epoch": 0.40433446546983665,
+ "grad_norm": 0.21174991130828857,
+ "learning_rate": 5.557777777777778e-06,
+ "loss": 0.0207,
+ "step": 2500
+ },
+ {
+ "epoch": 0.408377810124535,
+ "grad_norm": 0.2548556625843048,
+ "learning_rate": 5.5022222222222224e-06,
+ "loss": 0.0202,
+ "step": 2525
+ },
+ {
+ "epoch": 0.41242115477923336,
+ "grad_norm": 0.2106471061706543,
+ "learning_rate": 5.4466666666666665e-06,
+ "loss": 0.0206,
+ "step": 2550
+ },
+ {
+ "epoch": 0.41646449943393177,
+ "grad_norm": 0.20076259970664978,
+ "learning_rate": 5.391111111111111e-06,
+ "loss": 0.0188,
+ "step": 2575
+ },
+ {
+ "epoch": 0.4205078440886301,
+ "grad_norm": 0.20281557738780975,
+ "learning_rate": 5.335555555555556e-06,
+ "loss": 0.0189,
+ "step": 2600
+ },
+ {
+ "epoch": 0.4245511887433285,
+ "grad_norm": 0.23585236072540283,
+ "learning_rate": 5.28e-06,
+ "loss": 0.0193,
+ "step": 2625
+ },
+ {
+ "epoch": 0.42859453339802683,
+ "grad_norm": 0.20076635479927063,
+ "learning_rate": 5.224444444444445e-06,
+ "loss": 0.0188,
+ "step": 2650
+ },
+ {
+ "epoch": 0.4326378780527252,
+ "grad_norm": 0.20363172888755798,
+ "learning_rate": 5.168888888888889e-06,
+ "loss": 0.0188,
+ "step": 2675
+ },
+ {
+ "epoch": 0.4366812227074236,
+ "grad_norm": 0.20968079566955566,
+ "learning_rate": 5.113333333333333e-06,
+ "loss": 0.0198,
+ "step": 2700
+ },
+ {
+ "epoch": 0.44072456736212196,
+ "grad_norm": 0.2075379192829132,
+ "learning_rate": 5.057777777777778e-06,
+ "loss": 0.0192,
+ "step": 2725
+ },
+ {
+ "epoch": 0.4447679120168203,
+ "grad_norm": 0.19762763381004333,
+ "learning_rate": 5.002222222222223e-06,
+ "loss": 0.0192,
+ "step": 2750
+ },
+ {
+ "epoch": 0.44881125667151867,
+ "grad_norm": 0.2090006172657013,
+ "learning_rate": 4.946666666666667e-06,
+ "loss": 0.0196,
+ "step": 2775
+ },
+ {
+ "epoch": 0.452854601326217,
+ "grad_norm": 0.20064634084701538,
+ "learning_rate": 4.891111111111111e-06,
+ "loss": 0.0187,
+ "step": 2800
+ },
+ {
+ "epoch": 0.45689794598091543,
+ "grad_norm": 0.20843106508255005,
+ "learning_rate": 4.835555555555556e-06,
+ "loss": 0.0192,
+ "step": 2825
+ },
+ {
+ "epoch": 0.4609412906356138,
+ "grad_norm": 0.22503884136676788,
+ "learning_rate": 4.78e-06,
+ "loss": 0.0195,
+ "step": 2850
+ },
+ {
+ "epoch": 0.46498463529031214,
+ "grad_norm": 0.22823262214660645,
+ "learning_rate": 4.724444444444445e-06,
+ "loss": 0.0199,
+ "step": 2875
+ },
+ {
+ "epoch": 0.4690279799450105,
837
+ "grad_norm": 0.24186566472053528,
838
+ "learning_rate": 4.66888888888889e-06,
839
+ "loss": 0.0194,
840
+ "step": 2900
841
+ },
842
+ {
843
+ "epoch": 0.47307132459970885,
844
+ "grad_norm": 0.27571889758110046,
845
+ "learning_rate": 4.613333333333334e-06,
846
+ "loss": 0.0194,
847
+ "step": 2925
848
+ },
849
+ {
850
+ "epoch": 0.47711466925440726,
851
+ "grad_norm": 0.22955967485904694,
852
+ "learning_rate": 4.557777777777778e-06,
853
+ "loss": 0.0201,
854
+ "step": 2950
855
+ },
856
+ {
857
+ "epoch": 0.4811580139091056,
858
+ "grad_norm": 0.21334044635295868,
859
+ "learning_rate": 4.502222222222223e-06,
860
+ "loss": 0.0204,
861
+ "step": 2975
862
+ },
863
+ {
864
+ "epoch": 0.485201358563804,
865
+ "grad_norm": 0.20334798097610474,
866
+ "learning_rate": 4.446666666666667e-06,
867
+ "loss": 0.0207,
868
+ "step": 3000
869
+ },
870
+ {
871
+ "epoch": 0.485201358563804,
872
+ "eval_loss": 0.024814918637275696,
873
+ "eval_runtime": 14244.4286,
874
+ "eval_samples_per_second": 8.649,
875
+ "eval_steps_per_second": 0.541,
876
+ "eval_wer": 81.96210492149146,
877
+ "step": 3000
878
+ },
879
+ {
880
+ "epoch": 0.48924470321850233,
881
+ "grad_norm": 0.23993884027004242,
882
+ "learning_rate": 4.391111111111112e-06,
883
+ "loss": 0.0192,
884
+ "step": 3025
885
+ },
886
+ {
887
+ "epoch": 0.4932880478732007,
888
+ "grad_norm": 0.19718846678733826,
889
+ "learning_rate": 4.3355555555555565e-06,
890
+ "loss": 0.0202,
891
+ "step": 3050
892
+ },
893
+ {
894
+ "epoch": 0.4973313925278991,
895
+ "grad_norm": 0.21749120950698853,
896
+ "learning_rate": 4.2800000000000005e-06,
897
+ "loss": 0.0191,
898
+ "step": 3075
899
+ },
900
+ {
901
+ "epoch": 0.5013747371825974,
902
+ "grad_norm": 0.22544993460178375,
903
+ "learning_rate": 4.2244444444444446e-06,
904
+ "loss": 0.0189,
905
+ "step": 3100
906
+ },
907
+ {
908
+ "epoch": 0.5054180818372959,
909
+ "grad_norm": 0.20528769493103027,
910
+ "learning_rate": 4.168888888888889e-06,
911
+ "loss": 0.0195,
912
+ "step": 3125
913
+ },
914
+ {
915
+ "epoch": 0.5094614264919942,
916
+ "grad_norm": 0.2464672178030014,
917
+ "learning_rate": 4.1133333333333335e-06,
918
+ "loss": 0.0184,
919
+ "step": 3150
920
+ },
921
+ {
922
+ "epoch": 0.5135047711466926,
923
+ "grad_norm": 0.22547754645347595,
924
+ "learning_rate": 4.057777777777778e-06,
925
+ "loss": 0.0202,
926
+ "step": 3175
927
+ },
928
+ {
929
+ "epoch": 0.5175481158013909,
930
+ "grad_norm": 0.1981554925441742,
931
+ "learning_rate": 4.002222222222222e-06,
932
+ "loss": 0.0196,
933
+ "step": 3200
934
+ },
935
+ {
936
+ "epoch": 0.5215914604560893,
937
+ "grad_norm": 0.23502561450004578,
938
+ "learning_rate": 3.946666666666667e-06,
939
+ "loss": 0.0188,
940
+ "step": 3225
941
+ },
942
+ {
943
+ "epoch": 0.5256348051107876,
944
+ "grad_norm": 0.23244182765483856,
945
+ "learning_rate": 3.891111111111111e-06,
946
+ "loss": 0.0194,
947
+ "step": 3250
948
+ },
949
+ {
950
+ "epoch": 0.529678149765486,
951
+ "grad_norm": 0.23334389925003052,
952
+ "learning_rate": 3.835555555555555e-06,
953
+ "loss": 0.019,
954
+ "step": 3275
955
+ },
956
+ {
957
+ "epoch": 0.5337214944201844,
958
+ "grad_norm": 0.24255788326263428,
959
+ "learning_rate": 3.7800000000000002e-06,
960
+ "loss": 0.019,
961
+ "step": 3300
962
+ },
963
+ {
964
+ "epoch": 0.5377648390748827,
965
+ "grad_norm": 0.19664430618286133,
966
+ "learning_rate": 3.724444444444445e-06,
967
+ "loss": 0.0186,
968
+ "step": 3325
969
+ },
970
+ {
971
+ "epoch": 0.5418081837295811,
972
+ "grad_norm": 0.23343808948993683,
973
+ "learning_rate": 3.668888888888889e-06,
974
+ "loss": 0.0185,
975
+ "step": 3350
976
+ },
977
+ {
978
+ "epoch": 0.5458515283842795,
979
+ "grad_norm": 0.22261999547481537,
980
+ "learning_rate": 3.6133333333333336e-06,
981
+ "loss": 0.0185,
982
+ "step": 3375
983
+ },
984
+ {
985
+ "epoch": 0.5498948730389779,
986
+ "grad_norm": 0.26387473940849304,
987
+ "learning_rate": 3.5577777777777785e-06,
988
+ "loss": 0.0191,
989
+ "step": 3400
990
+ },
991
+ {
992
+ "epoch": 0.5539382176936762,
993
+ "grad_norm": 0.2524121105670929,
994
+ "learning_rate": 3.5022222222222225e-06,
995
+ "loss": 0.019,
996
+ "step": 3425
997
+ },
998
+ {
999
+ "epoch": 0.5579815623483746,
1000
+ "grad_norm": 0.21356618404388428,
1001
+ "learning_rate": 3.446666666666667e-06,
1002
+ "loss": 0.0202,
1003
+ "step": 3450
1004
+ },
1005
+ {
1006
+ "epoch": 0.562024907003073,
1007
+ "grad_norm": 0.20150095224380493,
1008
+ "learning_rate": 3.391111111111111e-06,
1009
+ "loss": 0.0199,
1010
+ "step": 3475
1011
+ },
1012
+ {
1013
+ "epoch": 0.5660682516577713,
1014
+ "grad_norm": 0.20904186367988586,
1015
+ "learning_rate": 3.335555555555556e-06,
1016
+ "loss": 0.0187,
1017
+ "step": 3500
1018
+ },
1019
+ {
1020
+ "epoch": 0.5701115963124697,
1021
+ "grad_norm": 0.2027483582496643,
1022
+ "learning_rate": 3.2800000000000004e-06,
1023
+ "loss": 0.0184,
1024
+ "step": 3525
1025
+ },
1026
+ {
1027
+ "epoch": 0.574154940967168,
1028
+ "grad_norm": 0.23289534449577332,
1029
+ "learning_rate": 3.2244444444444444e-06,
1030
+ "loss": 0.0196,
1031
+ "step": 3550
1032
+ },
1033
+ {
1034
+ "epoch": 0.5781982856218664,
1035
+ "grad_norm": 0.23588407039642334,
1036
+ "learning_rate": 3.1688888888888893e-06,
1037
+ "loss": 0.0179,
1038
+ "step": 3575
1039
+ },
1040
+ {
1041
+ "epoch": 0.5822416302765647,
1042
+ "grad_norm": 0.2624644935131073,
1043
+ "learning_rate": 3.1133333333333337e-06,
1044
+ "loss": 0.0186,
1045
+ "step": 3600
1046
+ },
1047
+ {
1048
+ "epoch": 0.5862849749312632,
1049
+ "grad_norm": 0.21287862956523895,
1050
+ "learning_rate": 3.0577777777777778e-06,
1051
+ "loss": 0.0192,
1052
+ "step": 3625
1053
+ },
1054
+ {
1055
+ "epoch": 0.5903283195859615,
1056
+ "grad_norm": 0.24328123033046722,
1057
+ "learning_rate": 3.0022222222222227e-06,
1058
+ "loss": 0.0191,
1059
+ "step": 3650
1060
+ },
1061
+ {
1062
+ "epoch": 0.5943716642406599,
1063
+ "grad_norm": 0.1968221366405487,
1064
+ "learning_rate": 2.946666666666667e-06,
1065
+ "loss": 0.0189,
1066
+ "step": 3675
1067
+ },
1068
+ {
1069
+ "epoch": 0.5984150088953583,
1070
+ "grad_norm": 0.22048558294773102,
1071
+ "learning_rate": 2.891111111111111e-06,
1072
+ "loss": 0.0193,
1073
+ "step": 3700
1074
+ },
1075
+ {
1076
+ "epoch": 0.6024583535500566,
1077
+ "grad_norm": 0.23322191834449768,
1078
+ "learning_rate": 2.835555555555556e-06,
1079
+ "loss": 0.0203,
1080
+ "step": 3725
1081
+ },
1082
+ {
1083
+ "epoch": 0.606501698204755,
1084
+ "grad_norm": 0.2503873407840729,
1085
+ "learning_rate": 2.7800000000000005e-06,
1086
+ "loss": 0.0187,
1087
+ "step": 3750
1088
+ },
1089
+ {
1090
+ "epoch": 0.6105450428594533,
1091
+ "grad_norm": 0.23583608865737915,
1092
+ "learning_rate": 2.7244444444444445e-06,
1093
+ "loss": 0.0182,
1094
+ "step": 3775
1095
+ },
1096
+ {
1097
+ "epoch": 0.6145883875141517,
1098
+ "grad_norm": 0.21786954998970032,
1099
+ "learning_rate": 2.6688888888888894e-06,
1100
+ "loss": 0.0187,
1101
+ "step": 3800
1102
+ },
1103
+ {
1104
+ "epoch": 0.61863173216885,
1105
+ "grad_norm": 0.21708305180072784,
1106
+ "learning_rate": 2.6133333333333334e-06,
1107
+ "loss": 0.0207,
1108
+ "step": 3825
1109
+ },
1110
+ {
1111
+ "epoch": 0.6226750768235484,
1112
+ "grad_norm": 0.21468117833137512,
1113
+ "learning_rate": 2.557777777777778e-06,
1114
+ "loss": 0.0206,
1115
+ "step": 3850
1116
+ },
1117
+ {
1118
+ "epoch": 0.6267184214782469,
1119
+ "grad_norm": 0.2351790964603424,
1120
+ "learning_rate": 2.5022222222222224e-06,
1121
+ "loss": 0.0184,
1122
+ "step": 3875
1123
+ },
1124
+ {
1125
+ "epoch": 0.6307617661329452,
1126
+ "grad_norm": 0.20362141728401184,
1127
+ "learning_rate": 2.446666666666667e-06,
1128
+ "loss": 0.0181,
1129
+ "step": 3900
1130
+ },
1131
+ {
1132
+ "epoch": 0.6348051107876436,
1133
+ "grad_norm": 0.22791029512882233,
1134
+ "learning_rate": 2.3911111111111113e-06,
1135
+ "loss": 0.0176,
1136
+ "step": 3925
1137
+ },
1138
+ {
1139
+ "epoch": 0.6388484554423419,
1140
+ "grad_norm": 0.23184864223003387,
1141
+ "learning_rate": 2.3355555555555557e-06,
1142
+ "loss": 0.0201,
1143
+ "step": 3950
1144
+ },
1145
+ {
1146
+ "epoch": 0.6428918000970403,
1147
+ "grad_norm": 0.213025763630867,
1148
+ "learning_rate": 2.28e-06,
1149
+ "loss": 0.0178,
1150
+ "step": 3975
1151
+ },
1152
+ {
1153
+ "epoch": 0.6469351447517386,
1154
+ "grad_norm": 0.2062954306602478,
1155
+ "learning_rate": 2.2244444444444447e-06,
1156
+ "loss": 0.0198,
1157
+ "step": 4000
1158
+ },
1159
+ {
1160
+ "epoch": 0.6469351447517386,
1161
+ "eval_loss": 0.02430112473666668,
1162
+ "eval_runtime": 14235.4068,
1163
+ "eval_samples_per_second": 8.654,
1164
+ "eval_steps_per_second": 0.541,
1165
+ "eval_wer": 81.67647178155556,
1166
+ "step": 4000
1167
+ },
1168
+ {
1169
+ "epoch": 0.650978489406437,
1170
+ "grad_norm": 0.22564095258712769,
1171
+ "learning_rate": 2.168888888888889e-06,
1172
+ "loss": 0.0192,
1173
+ "step": 4025
1174
+ },
1175
+ {
1176
+ "epoch": 0.6550218340611353,
1177
+ "grad_norm": 0.21880663931369781,
1178
+ "learning_rate": 2.1133333333333336e-06,
1179
+ "loss": 0.0196,
1180
+ "step": 4050
1181
+ },
1182
+ {
1183
+ "epoch": 0.6590651787158337,
1184
+ "grad_norm": 0.22380299866199493,
1185
+ "learning_rate": 2.057777777777778e-06,
1186
+ "loss": 0.0193,
1187
+ "step": 4075
1188
+ },
1189
+ {
1190
+ "epoch": 0.663108523370532,
1191
+ "grad_norm": 0.16602382063865662,
1192
+ "learning_rate": 2.0022222222222225e-06,
1193
+ "loss": 0.0177,
1194
+ "step": 4100
1195
+ },
1196
+ {
1197
+ "epoch": 0.6671518680252305,
1198
+ "grad_norm": 0.2158273607492447,
1199
+ "learning_rate": 1.9466666666666665e-06,
1200
+ "loss": 0.0193,
1201
+ "step": 4125
1202
+ },
1203
+ {
1204
+ "epoch": 0.6711952126799289,
1205
+ "grad_norm": 0.22045934200286865,
1206
+ "learning_rate": 1.8911111111111114e-06,
1207
+ "loss": 0.0187,
1208
+ "step": 4150
1209
+ },
1210
+ {
1211
+ "epoch": 0.6752385573346272,
1212
+ "grad_norm": 0.22605691850185394,
1213
+ "learning_rate": 1.8355555555555557e-06,
1214
+ "loss": 0.0189,
1215
+ "step": 4175
1216
+ },
1217
+ {
1218
+ "epoch": 0.6792819019893256,
1219
+ "grad_norm": 0.2112116664648056,
1220
+ "learning_rate": 1.7800000000000001e-06,
1221
+ "loss": 0.0189,
1222
+ "step": 4200
1223
+ },
1224
+ {
1225
+ "epoch": 0.6833252466440239,
1226
+ "grad_norm": 0.22499610483646393,
1227
+ "learning_rate": 1.7244444444444448e-06,
1228
+ "loss": 0.0197,
1229
+ "step": 4225
1230
+ },
1231
+ {
1232
+ "epoch": 0.6873685912987223,
1233
+ "grad_norm": 0.21851369738578796,
1234
+ "learning_rate": 1.668888888888889e-06,
1235
+ "loss": 0.0194,
1236
+ "step": 4250
1237
+ },
1238
+ {
1239
+ "epoch": 0.6914119359534207,
1240
+ "grad_norm": 0.25461533665657043,
1241
+ "learning_rate": 1.6133333333333335e-06,
1242
+ "loss": 0.0181,
1243
+ "step": 4275
1244
+ },
1245
+ {
1246
+ "epoch": 0.695455280608119,
1247
+ "grad_norm": 0.19757238030433655,
1248
+ "learning_rate": 1.5577777777777777e-06,
1249
+ "loss": 0.0188,
1250
+ "step": 4300
1251
+ },
1252
+ {
1253
+ "epoch": 0.6994986252628174,
1254
+ "grad_norm": 0.16863584518432617,
1255
+ "learning_rate": 1.5022222222222224e-06,
1256
+ "loss": 0.0183,
1257
+ "step": 4325
1258
+ },
1259
+ {
1260
+ "epoch": 0.7035419699175157,
1261
+ "grad_norm": 0.21036742627620697,
1262
+ "learning_rate": 1.4466666666666669e-06,
1263
+ "loss": 0.0189,
1264
+ "step": 4350
1265
+ },
1266
+ {
1267
+ "epoch": 0.7075853145722142,
1268
+ "grad_norm": 0.2173764854669571,
1269
+ "learning_rate": 1.3911111111111111e-06,
1270
+ "loss": 0.0172,
1271
+ "step": 4375
1272
+ },
1273
+ {
1274
+ "epoch": 0.7116286592269125,
1275
+ "grad_norm": 0.2053259015083313,
1276
+ "learning_rate": 1.3355555555555558e-06,
1277
+ "loss": 0.0188,
1278
+ "step": 4400
1279
+ },
1280
+ {
1281
+ "epoch": 0.7156720038816109,
1282
+ "grad_norm": 0.22805319726467133,
1283
+ "learning_rate": 1.28e-06,
1284
+ "loss": 0.019,
1285
+ "step": 4425
1286
+ },
1287
+ {
1288
+ "epoch": 0.7197153485363093,
1289
+ "grad_norm": 0.20309989154338837,
1290
+ "learning_rate": 1.2244444444444445e-06,
1291
+ "loss": 0.0184,
1292
+ "step": 4450
1293
+ },
1294
+ {
1295
+ "epoch": 0.7237586931910076,
1296
+ "grad_norm": 0.20786143839359283,
1297
+ "learning_rate": 1.168888888888889e-06,
1298
+ "loss": 0.0177,
1299
+ "step": 4475
1300
+ },
1301
+ {
1302
+ "epoch": 0.727802037845706,
1303
+ "grad_norm": 0.17660555243492126,
1304
+ "learning_rate": 1.1133333333333334e-06,
1305
+ "loss": 0.0185,
1306
+ "step": 4500
1307
+ },
1308
+ {
1309
+ "epoch": 0.7318453825004043,
1310
+ "grad_norm": 0.25932908058166504,
1311
+ "learning_rate": 1.0577777777777779e-06,
1312
+ "loss": 0.0189,
1313
+ "step": 4525
1314
+ },
1315
+ {
1316
+ "epoch": 0.7358887271551027,
1317
+ "grad_norm": 0.1857845038175583,
1318
+ "learning_rate": 1.0022222222222223e-06,
1319
+ "loss": 0.018,
1320
+ "step": 4550
1321
+ },
1322
+ {
1323
+ "epoch": 0.739932071809801,
1324
+ "grad_norm": 0.17438913881778717,
1325
+ "learning_rate": 9.466666666666667e-07,
1326
+ "loss": 0.0181,
1327
+ "step": 4575
1328
+ },
1329
+ {
1330
+ "epoch": 0.7439754164644994,
1331
+ "grad_norm": 0.2224661409854889,
1332
+ "learning_rate": 8.911111111111112e-07,
1333
+ "loss": 0.0181,
1334
+ "step": 4600
1335
+ },
1336
+ {
1337
+ "epoch": 0.7480187611191979,
1338
+ "grad_norm": 0.2210237681865692,
1339
+ "learning_rate": 8.355555555555556e-07,
1340
+ "loss": 0.0183,
1341
+ "step": 4625
1342
+ },
1343
+ {
1344
+ "epoch": 0.7520621057738962,
1345
+ "grad_norm": 0.20891518890857697,
1346
+ "learning_rate": 7.8e-07,
1347
+ "loss": 0.019,
1348
+ "step": 4650
1349
+ },
1350
+ {
1351
+ "epoch": 0.7561054504285946,
1352
+ "grad_norm": 0.2342623621225357,
1353
+ "learning_rate": 7.244444444444446e-07,
1354
+ "loss": 0.0192,
1355
+ "step": 4675
1356
+ },
1357
+ {
1358
+ "epoch": 0.7601487950832929,
1359
+ "grad_norm": 0.22586888074874878,
1360
+ "learning_rate": 6.68888888888889e-07,
1361
+ "loss": 0.0187,
1362
+ "step": 4700
1363
+ },
1364
+ {
1365
+ "epoch": 0.7641921397379913,
1366
+ "grad_norm": 0.21685567498207092,
1367
+ "learning_rate": 6.133333333333333e-07,
1368
+ "loss": 0.019,
1369
+ "step": 4725
1370
+ },
1371
+ {
1372
+ "epoch": 0.7682354843926896,
1373
+ "grad_norm": 0.19827619194984436,
1374
+ "learning_rate": 5.577777777777779e-07,
1375
+ "loss": 0.0189,
1376
+ "step": 4750
1377
+ },
1378
+ {
1379
+ "epoch": 0.772278829047388,
1380
+ "grad_norm": 0.21666119992733002,
1381
+ "learning_rate": 5.022222222222222e-07,
1382
+ "loss": 0.0189,
1383
+ "step": 4775
1384
+ },
1385
+ {
1386
+ "epoch": 0.7763221737020863,
1387
+ "grad_norm": 0.2064637690782547,
1388
+ "learning_rate": 4.466666666666667e-07,
1389
+ "loss": 0.0185,
1390
+ "step": 4800
1391
+ },
1392
+ {
1393
+ "epoch": 0.7803655183567847,
1394
+ "grad_norm": 0.21813035011291504,
1395
+ "learning_rate": 3.9111111111111115e-07,
1396
+ "loss": 0.0201,
1397
+ "step": 4825
1398
+ },
1399
+ {
1400
+ "epoch": 0.784408863011483,
1401
+ "grad_norm": 0.2053181678056717,
1402
+ "learning_rate": 3.3555555555555556e-07,
1403
+ "loss": 0.019,
1404
+ "step": 4850
1405
+ },
1406
+ {
1407
+ "epoch": 0.7884522076661815,
1408
+ "grad_norm": 0.2073875218629837,
1409
+ "learning_rate": 2.8e-07,
1410
+ "loss": 0.0175,
1411
+ "step": 4875
1412
+ },
1413
+ {
1414
+ "epoch": 0.7924955523208799,
1415
+ "grad_norm": 0.22397322952747345,
1416
+ "learning_rate": 2.2444444444444445e-07,
1417
+ "loss": 0.0187,
1418
+ "step": 4900
1419
+ },
1420
+ {
1421
+ "epoch": 0.7965388969755782,
1422
+ "grad_norm": 0.22646521031856537,
1423
+ "learning_rate": 1.6888888888888888e-07,
1424
+ "loss": 0.0187,
1425
+ "step": 4925
1426
+ },
1427
+ {
1428
+ "epoch": 0.8005822416302766,
1429
+ "grad_norm": 0.24352099001407623,
1430
+ "learning_rate": 1.1333333333333336e-07,
1431
+ "loss": 0.0195,
1432
+ "step": 4950
1433
+ },
1434
+ {
1435
+ "epoch": 0.8046255862849749,
1436
+ "grad_norm": 0.20547421276569366,
1437
+ "learning_rate": 5.777777777777778e-08,
1438
+ "loss": 0.0175,
1439
+ "step": 4975
1440
+ },
1441
+ {
1442
+ "epoch": 0.8086689309396733,
1443
+ "grad_norm": 0.19414258003234863,
1444
+ "learning_rate": 2.2222222222222225e-09,
1445
+ "loss": 0.0187,
1446
+ "step": 5000
1447
+ },
1448
+ {
1449
+ "epoch": 0.8086689309396733,
1450
+ "eval_loss": 0.024050358682870865,
1451
+ "eval_runtime": 14151.7244,
1452
+ "eval_samples_per_second": 8.705,
1453
+ "eval_steps_per_second": 0.544,
1454
+ "eval_wer": 81.25857102284255,
1455
+ "step": 5000
1456
+ }
1457
+ ],
1458
+ "logging_steps": 25,
1459
+ "max_steps": 5000,
1460
+ "num_input_tokens_seen": 0,
1461
+ "num_train_epochs": 1,
1462
+ "save_steps": 1000,
1463
+ "stateful_callbacks": {
1464
+ "TrainerControl": {
1465
+ "args": {
1466
+ "should_epoch_stop": false,
1467
+ "should_evaluate": false,
1468
+ "should_log": false,
1469
+ "should_save": true,
1470
+ "should_training_stop": true
1471
+ },
1472
+ "attributes": {}
1473
+ }
1474
+ },
1475
+ "total_flos": 4.61736640512e+19,
1476
+ "train_batch_size": 32,
1477
+ "trial_name": null,
1478
+ "trial_params": null
1479
+ }
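The `log_history` entries above interleave training records (`loss`, `grad_norm`, `learning_rate` every 25 steps) with evaluation records (`eval_loss`, `eval_wer` every 1000 steps). A minimal sketch for summarizing such a file — the inline sample mirrors the structure above; reading from a real `checkpoint-5000/trainer_state.json` path is an assumption about the repo layout:

```python
import json

# Sample with the same shape as trainer_state.json above (values copied from step 5000).
# In practice you would use: state = json.load(open("checkpoint-5000/trainer_state.json"))
sample = """
{
  "log_history": [
    {"epoch": 0.8086689309396733, "grad_norm": 0.19414258003234863,
     "learning_rate": 2.2222222222222225e-09, "loss": 0.0187, "step": 5000},
    {"epoch": 0.8086689309396733, "eval_loss": 0.024050358682870865,
     "eval_wer": 81.25857102284255, "step": 5000}
  ],
  "max_steps": 5000
}
"""
state = json.loads(sample)

# Training records carry "loss"; evaluation records carry "eval_wer" instead.
train_logs = [e for e in state["log_history"] if "loss" in e]
eval_logs = [e for e in state["log_history"] if "eval_wer" in e]

print(min(e["loss"] for e in train_logs))  # lowest logged training loss in the sample
print(eval_logs[-1]["eval_wer"])           # WER at the most recent evaluation
```

Pointing the same loop at the full file gives the loss/WER curves across all 5000 steps.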
checkpoint-5000/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f9f7caca91e9e74c1f1f32c6f743e7a01e94bbb4a987606d363be4bc023f04fb
+ size 5496
config.json ADDED
@@ -0,0 +1,60 @@
+ {
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": null,
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 768,
+ "decoder_attention_heads": 12,
+ "decoder_ffn_dim": 3072,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 12,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 12,
+ "encoder_ffn_dim": 3072,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 12,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [
+ [
+ 1,
+ 50259
+ ],
+ [
+ 2,
+ 50359
+ ],
+ [
+ 3,
+ 50363
+ ]
+ ],
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": null,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 12,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "torch_dtype": "float32",
+ "transformers_version": "4.51.3",
+ "use_cache": true,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
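The geometry in this config (12 encoder/decoder layers, `d_model` 768, 12 attention heads, 80 mel bins) matches the Whisper small architecture. A quick sketch deriving a couple of quantities from the values above, as a sanity check on the config; the dict literal simply copies fields from config.json:

```python
# Fields copied from config.json above.
config = {
    "d_model": 768,
    "encoder_attention_heads": 12,
    "decoder_attention_heads": 12,
    "encoder_layers": 12,
    "decoder_layers": 12,
    "num_mel_bins": 80,
    "vocab_size": 51865,
}

# Per-head dimension: the model width must divide evenly across attention heads.
assert config["d_model"] % config["encoder_attention_heads"] == 0
head_dim = config["d_model"] // config["encoder_attention_heads"]
print(head_dim)  # 64
```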
generation_config.json ADDED
@@ -0,0 +1,254 @@
+ {
+ "alignment_heads": [
+ [
+ 5,
+ 3
+ ],
+ [
+ 5,
+ 9
+ ],
+ [
+ 8,
+ 0
+ ],
+ [
+ 8,
+ 4
+ ],
+ [
+ 8,
+ 7
+ ],
+ [
+ 8,
+ 8
+ ],
+ [
+ 9,
+ 0
+ ],
+ [
+ 9,
+ 7
+ ],
+ [
+ 9,
+ 9
+ ],
+ [
+ 10,
+ 5
+ ]
+ ],
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327,
+ "<|am|>": 50334,
+ "<|ar|>": 50272,
+ "<|as|>": 50350,
+ "<|az|>": 50304,
+ "<|ba|>": 50355,
+ "<|be|>": 50330,
+ "<|bg|>": 50292,
+ "<|bn|>": 50302,
+ "<|bo|>": 50347,
+ "<|br|>": 50309,
+ "<|bs|>": 50315,
+ "<|ca|>": 50270,
+ "<|cs|>": 50283,
+ "<|cy|>": 50297,
+ "<|da|>": 50285,
+ "<|de|>": 50261,
+ "<|el|>": 50281,
+ "<|en|>": 50259,
+ "<|es|>": 50262,
+ "<|et|>": 50307,
+ "<|eu|>": 50310,
+ "<|fa|>": 50300,
+ "<|fi|>": 50277,
+ "<|fo|>": 50338,
+ "<|fr|>": 50265,
+ "<|gl|>": 50319,
+ "<|gu|>": 50333,
+ "<|haw|>": 50352,
+ "<|ha|>": 50354,
+ "<|he|>": 50279,
+ "<|hi|>": 50276,
+ "<|hr|>": 50291,
+ "<|ht|>": 50339,
+ "<|hu|>": 50286,
+ "<|hy|>": 50312,
+ "<|id|>": 50275,
+ "<|is|>": 50311,
+ "<|it|>": 50274,
+ "<|ja|>": 50266,
+ "<|jw|>": 50356,
+ "<|ka|>": 50329,
+ "<|kk|>": 50316,
+ "<|km|>": 50323,
+ "<|kn|>": 50306,
+ "<|ko|>": 50264,
+ "<|la|>": 50294,
+ "<|lb|>": 50345,
+ "<|ln|>": 50353,
+ "<|lo|>": 50336,
+ "<|lt|>": 50293,
+ "<|lv|>": 50301,
+ "<|mg|>": 50349,
+ "<|mi|>": 50295,
+ "<|mk|>": 50308,
+ "<|ml|>": 50296,
+ "<|mn|>": 50314,
+ "<|mr|>": 50320,
+ "<|ms|>": 50282,
+ "<|mt|>": 50343,
+ "<|my|>": 50346,
+ "<|ne|>": 50313,
+ "<|nl|>": 50271,
+ "<|nn|>": 50342,
+ "<|no|>": 50288,
+ "<|oc|>": 50328,
+ "<|pa|>": 50321,
+ "<|pl|>": 50269,
+ "<|ps|>": 50340,
+ "<|pt|>": 50267,
+ "<|ro|>": 50284,
+ "<|ru|>": 50263,
+ "<|sa|>": 50344,
+ "<|sd|>": 50332,
+ "<|si|>": 50322,
+ "<|sk|>": 50298,
+ "<|sl|>": 50305,
+ "<|sn|>": 50324,
+ "<|so|>": 50326,
+ "<|sq|>": 50317,
+ "<|sr|>": 50303,
+ "<|su|>": 50357,
+ "<|sv|>": 50273,
+ "<|sw|>": 50318,
+ "<|ta|>": 50287,
+ "<|te|>": 50299,
+ "<|tg|>": 50331,
+ "<|th|>": 50289,
+ "<|tk|>": 50341,
+ "<|tl|>": 50348,
+ "<|tr|>": 50268,
+ "<|tt|>": 50351,
+ "<|uk|>": 50280,
+ "<|ur|>": 50290,
+ "<|uz|>": 50337,
+ "<|vi|>": 50278,
+ "<|yi|>": 50335,
+ "<|yo|>": 50325,
+ "<|zh|>": 50260
+ },
+ "language": "chinese",
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1,
+ 2,
+ 7,
+ 8,
+ 9,
+ 10,
+ 14,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 31,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 90,
+ 91,
+ 92,
+ 93,
+ 359,
+ 503,
+ 522,
+ 542,
+ 873,
+ 893,
+ 902,
+ 918,
+ 922,
+ 931,
+ 1350,
+ 1853,
+ 1982,
+ 2460,
+ 2627,
+ 3246,
+ 3253,
+ 3268,
+ 3536,
+ 3846,
+ 3961,
+ 4183,
+ 4667,
+ 6585,
+ 6647,
+ 7273,
+ 9061,
+ 9383,
+ 10428,
+ 10929,
+ 11938,
+ 12033,
+ 12331,
+ 12562,
+ 13793,
+ 14157,
+ 14635,
+ 15265,
+ 15618,
+ 16553,
+ 16604,
+ 18362,
+ 18956,
+ 20075,
+ 21675,
+ 22520,
+ 26130,
+ 26161,
+ 26435,
+ 28279,
+ 29464,
+ 31650,
+ 32302,
+ 32470,
+ 36865,
+ 42863,
+ 47425,
+ 49870,
+ 50254,
+ 50258,
+ 50360,
+ 50361,
+ 50362
+ ],
+ "task": "transcribe",
+ "task_to_id": {
+ "transcribe": 50359,
+ "translate": 50358
+ },
+ "transformers_version": "4.51.3"
+ }
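At generation time, Whisper's decoder is primed with a start-of-transcript sequence built from the fields above: `decoder_start_token_id`, a language token from `lang_to_id`, a task token from `task_to_id`, and `no_timestamps_token_id` when `return_timestamps` is false. A sketch reconstructing that prompt for the `"chinese"`/`"transcribe"` setting in this config (the dict copies only the relevant fields; the helper name is illustrative):

```python
# Relevant fields copied from generation_config.json above.
gen_cfg = {
    "decoder_start_token_id": 50258,              # <|startoftranscript|>
    "lang_to_id": {"<|zh|>": 50260, "<|en|>": 50259},
    "task_to_id": {"transcribe": 50359, "translate": 50358},
    "no_timestamps_token_id": 50363,              # <|notimestamps|>
}

def decoder_prompt(cfg, language="<|zh|>", task="transcribe", timestamps=False):
    """Build the forced decoder prefix: SOT, language, task, optional no-timestamps."""
    ids = [cfg["decoder_start_token_id"],
           cfg["lang_to_id"][language],
           cfg["task_to_id"][task]]
    if not timestamps:
        ids.append(cfg["no_timestamps_token_id"])
    return ids

print(decoder_prompt(gen_cfg))  # [50258, 50260, 50359, 50363]
```

Note that `config.json`'s `forced_decoder_ids` pins position 1 to 50259 (`<|en|>`), while this generation config sets `language` to `"chinese"`; the generation config takes effect during `model.generate`.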
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2a55144040a6e7b23d09e36e30316ad1cd153f71fb213853d9e0f9baf0025f96
+ size 966995080
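The `model.safetensors` and `training_args.bin` entries in this commit are Git LFS pointer files rather than the binary blobs themselves; the real content is fetched by hash on checkout. A small sketch parsing such a pointer (text copied from the diff above):

```python
# LFS pointer text as it appears in the diff above.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:2a55144040a6e7b23d09e36e30316ad1cd153f71fb213853d9e0f9baf0025f96
size 966995080
"""

# Each pointer line is "key value"; split on the first space.
fields = dict(line.split(" ", 1) for line in pointer.strip().splitlines())

print(fields["oid"])   # content-addressed hash of the stored object
print(fields["size"])  # 966995080 bytes, roughly 967 MB of fp32 weights
```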