sandernotenbaert committed
Commit bdc5271 · verified · Parent: 70b0b8a

Upload structure model checkpoint at step 6000
checkpoint_6000/README.md ADDED
@@ -0,0 +1,46 @@
+ ---
+ library_name: mlx
+ pipeline_tag: text-generation
+ tags:
+ - music
+ - midi
+ - generation
+ - mlx
+ - autoregressive
+ - structure
+ - musiclang
+ ---
+
+ # okai-musiclang-structure v2.0 - Structure Model
+
+ This is an autoregressive structure-generation model for music, built with MLX.
+
+ ## Model Details
+
+ - **Model Type**: Structure Generator (Autoregressive)
+ - **Version**: v2.0
+ - **Step**: 6000
+ - **Architecture**: Transformer with causal language modeling
+ - **Vocabulary Size**: 4796
+ - **Model Dimension**: 256
+ - **Layers**: 6
+ - **Max Sequence Length**: 1024
+
+ ## Training Configuration
+
+ - **Batch Size**: 16
+ - **Learning Rate**: 1e-05
+ - **Training Steps**: 6000
+
+ ## Usage
+
+ The model generates sequential music structure:
+ - Input: song control tokens (genre, instruments, etc.)
+ - Output: sequential bar structure with chords and tonality
+
+ Example generation:
+ ```
+ GENRE__ROCK SUBGENRE__ALTERNATIVE START BAR__1 CHORD_DEGREE__1 TONALITY_DEGREE__1 BAR__2 CHORD_DEGREE__4 TONALITY_DEGREE__5 ... WILL_END
+ ```
+
+ Built with the MLX framework for Apple Silicon.
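The prompt-and-continue flow the README describes can be sketched as a plain greedy decoding loop. Note that `next_token` below is a hypothetical stand-in for the actual MLX model call; only the token format (control tokens in, bar/chord/tonality tokens out, terminated by `WILL_END`) comes from the model card.

```python
# Sketch of the autoregressive structure-generation loop.
# `next_token` is a hypothetical stand-in for the real MLX model;
# the token names follow the README's example sequence.

def next_token(tokens):
    # Hypothetical model call: returns the next structure token.
    # Here a canned continuation is replayed for illustration only.
    canned = ["START", "BAR__1", "CHORD_DEGREE__1", "TONALITY_DEGREE__1",
              "BAR__2", "CHORD_DEGREE__4", "TONALITY_DEGREE__5", "WILL_END"]
    return canned[len(tokens) - 2]  # offset past the two control tokens

def generate_structure(control_tokens, max_len=1024):
    """Greedy decode until WILL_END or the model's max sequence length (1024)."""
    tokens = list(control_tokens)
    while len(tokens) < max_len:
        tok = next_token(tokens)
        tokens.append(tok)
        if tok == "WILL_END":
            break
    return tokens

seq = generate_structure(["GENRE__ROCK", "SUBGENRE__ALTERNATIVE"])
print(" ".join(seq))
```

With a real model, `next_token` would run the transformer over the current sequence and sample (or argmax) from the logits; the stopping condition and prompt layout stay the same.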
checkpoint_6000/config.json ADDED
@@ -0,0 +1,14 @@
+ {
+ "model_name": "okai-musiclang-structure",
+ "model_version": "v2.0",
+ "model_type": "structure",
+ "global_step": 6000,
+ "architecture": "AutoregressiveTransformer",
+ "training_type": "causal_lm",
+ "vocab_size": 4796,
+ "model_dim": 256,
+ "num_heads": 8,
+ "num_layers": 6,
+ "max_sequence_length": 1024,
+ "dropout": 0.1
+ }
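Only the architecture fields in config.json are needed to rebuild the network at load time; the rest (name, version, step) is metadata. A minimal loading sketch, where the field names are taken verbatim from the file above but the `StructureConfig` class itself is hypothetical, not part of the repo:

```python
import json
from dataclasses import dataclass

# Hypothetical container mirroring checkpoint_6000/config.json;
# field names are copied verbatim from the file.
@dataclass
class StructureConfig:
    vocab_size: int
    model_dim: int
    num_heads: int
    num_layers: int
    max_sequence_length: int
    dropout: float

def load_config(path):
    """Read config.json, keeping only the fields the model constructor needs."""
    with open(path) as f:
        raw = json.load(f)
    keys = StructureConfig.__dataclass_fields__
    return StructureConfig(**{k: raw[k] for k in keys})
```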
checkpoint_6000/model.npz ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:702b361e9e7c22ee60d66b968fdc09d957b7441f0341d2c5a019df2a1e361ea6
+ size 29839884
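model.npz is stored as a Git LFS pointer: the `oid` line is the SHA-256 of the real 29,839,884-byte weights file, which `git lfs pull` fetches. A small sketch for verifying a downloaded file against that pointer (the helper name is ours, not part of the repo):

```python
import hashlib

def lfs_oid(path, chunk=1 << 20):
    """Compute the sha256 digest an LFS pointer's `oid` line refers to."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return "sha256:" + h.hexdigest()

# After `git lfs pull`, the real model.npz should satisfy:
# lfs_oid("checkpoint_6000/model.npz")
#   == "sha256:702b361e9e7c22ee60d66b968fdc09d957b7441f0341d2c5a019df2a1e361ea6"
```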
checkpoint_6000/training_state.json ADDED
@@ -0,0 +1,320 @@
+ {
+ "global_step": 6000,
+ "epoch": 3,
+ "trainer_step": 6000,
+ "learning_rate": 9.999999747378752e-06,
+ "epoch_losses": [
+ 0.8267251253128052,
+ 0.7156606912612915,
+ 0.860819935798645,
+ 0.9418178796768188,
+ 0.7954195141792297,
+ 0.785536527633667,
+ 0.880850076675415,
+ 0.9444862604141235,
+ 0.710844874382019,
+ 0.8115501999855042,
+ 0.8385743498802185,
+ 0.8324074745178223,
+ 0.8686010241508484,
+ 0.8397880792617798,
+ 0.7364058494567871,
+ 0.8601733446121216,
+ 0.879347026348114,
+ 0.7917970418930054,
+ 0.8743105530738831,
+ 0.8263768553733826,
+ 0.8274832367897034,
+ 0.7282205820083618,
+ 0.8505825400352478,
+ 0.92011958360672,
+ 0.830745279788971,
+ 0.8573527932167053,
+ 0.8543904423713684,
+ 0.9390841722488403,
+ 0.8852065801620483,
+ 0.872724175453186,
+ 0.8681007027626038,
+ 0.8253645896911621,
+ 0.7693959474563599,
+ 0.8637677431106567,
+ 0.8749911785125732,
+ 0.7714352011680603,
+ 0.8498124480247498,
+ 0.7690132260322571,
+ 0.8109665513038635,
+ 0.8016248941421509,
+ 0.7285829782485962,
+ 0.8874986171722412,
+ 0.7933679223060608,
+ 0.8005619645118713,
+ 0.8700001835823059,
+ 0.8136159777641296,
+ 0.8945200443267822,
+ 0.7730787992477417,
+ 0.7665235996246338,
+ 0.7343435287475586,
+ 0.7805270552635193,
+ 0.9628711938858032,
+ 0.7565982937812805,
+ 0.8406789302825928,
+ 0.9148065447807312,
+ 0.8471393585205078,
+ 0.7834373712539673,
+ 0.7912096381187439,
+ 0.8091537952423096,
+ 0.8797505497932434,
+ 0.781848669052124,
+ 0.8427628874778748,
+ 0.7648460865020752,
+ 0.8447132706642151,
+ 0.8403868079185486,
+ 0.7498573064804077,
+ 0.7488678097724915,
+ 0.8513166308403015,
+ 0.7660120725631714,
+ 0.7732702493667603,
+ 0.7615871429443359,
+ 0.8227044939994812,
+ 0.8294056057929993,
+ 0.8804996609687805,
+ 0.8145940899848938,
+ 0.7456373572349548,
+ 0.7570104002952576,
+ 0.8333815336227417,
+ 0.9013676643371582,
+ 0.7411203980445862,
+ 0.8191972374916077,
+ 0.7782153487205505,
+ 0.8326579928398132,
+ 0.845100462436676,
+ 0.9998421669006348,
+ 0.7848277688026428,
+ 0.7959708571434021,
+ 0.8005611300468445,
+ 0.827416181564331,
+ 0.7992665767669678,
+ 0.7432759404182434,
+ 0.6925281882286072,
+ 0.8919035792350769,
+ 0.7495708465576172,
+ 0.7746514081954956,
+ 0.8433645963668823,
+ 0.7935962677001953,
+ 0.8344244956970215,
+ 0.8487056493759155,
+ 0.9141316413879395,
+ 0.8923897743225098,
+ 0.8714606165885925,
+ 0.8831012845039368,
+ 0.7249584197998047,
+ 0.8017855286598206,
+ 0.7325617074966431,
+ 0.8545896410942078,
+ 0.8401230573654175,
+ 0.819802463054657,
+ 0.8069342970848083,
+ 0.7293420433998108,
+ 0.8219670057296753,
+ 0.8231580257415771,
+ 0.7832656502723694,
+ 0.8409003019332886,
+ 0.7875893712043762,
+ 0.7834862470626831,
+ 0.8598553538322449,
+ 0.7544928193092346,
+ 0.7602822184562683,
+ 0.8253030180931091,
+ 0.7512518763542175,
+ 0.6745361089706421,
+ 0.8263822793960571,
+ 0.7470456957817078,
+ 0.7142889499664307,
+ 0.875754714012146,
+ 0.7587679624557495,
+ 0.7668524384498596,
+ 0.7340443730354309,
+ 0.7819405198097229,
+ 0.8259172439575195,
+ 0.8802277445793152,
+ 0.8219906091690063,
+ 0.8531641960144043,
+ 0.8079729676246643,
+ 0.8338596820831299,
+ 0.7942845225334167,
+ 0.7828904390335083,
+ 0.8068522810935974,
+ 0.7682546973228455,
+ 0.7650644183158875,
+ 0.8084389567375183,
+ 0.799694836139679,
+ 0.7674142718315125,
+ 0.8472267389297485,
+ 0.8099306225776672,
+ 0.8832398056983948,
+ 0.7811346054077148,
+ 0.8373596668243408,
+ 0.811606228351593,
+ 0.7994090914726257,
+ 0.854801595211029,
+ 0.786896288394928,
+ 0.7967766523361206,
+ 0.893247663974762,
+ 0.8646430969238281,
+ 0.7787424325942993,
+ 0.8987262845039368,
+ 0.8025528788566589,
+ 0.7871999740600586,
+ 0.8252074718475342,
+ 0.7703168392181396,
+ 0.8022949695587158,
+ 0.8270684480667114,
+ 0.7491687536239624,
+ 0.8273314237594604,
+ 0.8100937604904175,
+ 0.8045676946640015,
+ 0.8092302083969116,
+ 0.8559038043022156,
+ 0.81805020570755,
+ 0.8238271474838257,
+ 0.7470946907997131,
+ 0.7456565499305725,
+ 0.8375546336174011,
+ 0.8250463008880615,
+ 0.9221420288085938,
+ 0.8561370968818665,
+ 0.8031499981880188,
+ 0.8433287739753723,
+ 0.964592456817627,
+ 0.7806392908096313,
+ 0.8497879505157471,
+ 0.8931498527526855,
+ 0.773786187171936,
+ 0.8455853462219238,
+ 0.8512445688247681,
+ 0.7696901559829712,
+ 0.7745411396026611,
+ 0.9112189412117004,
+ 0.862769603729248,
+ 0.7580086588859558,
+ 0.7507232427597046,
+ 0.8742559552192688,
+ 0.8068438172340393,
+ 0.8243193030357361,
+ 0.7685670852661133,
+ 0.7798039317131042,
+ 0.7855411171913147,
+ 0.8399823904037476,
+ 0.8017387986183167,
+ 0.7240294814109802,
+ 0.8370116353034973,
+ 0.7078456282615662,
+ 0.8576661348342896,
+ 0.8158095479011536,
+ 0.7758065462112427,
+ 0.7223475575447083,
+ 0.9219492077827454,
+ 0.8386049270629883,
+ 0.8989261984825134,
+ 0.7553024291992188,
+ 0.7741619348526001,
+ 0.7765299677848816,
+ 0.7559140920639038,
+ 0.7753311991691589,
+ 0.7087565660476685,
+ 0.7816688418388367,
+ 0.705898642539978,
+ 0.7425345778465271,
+ 0.8033889532089233,
+ 0.788788378238678,
+ 0.7933453917503357,
+ 0.8550800681114197,
+ 0.740778386592865,
+ 0.7846851944923401,
+ 0.9378728866577148,
+ 0.8104960322380066,
+ 0.8187196254730225,
+ 0.8880254626274109,
+ 0.8362064361572266,
+ 0.8159574866294861,
+ 0.7271130681037903,
+ 0.8182744979858398,
+ 0.8668578267097473,
+ 0.8170772194862366,
+ 0.8015566468238831,
+ 0.7983573079109192,
+ 0.7816091179847717,
+ 0.7623358964920044,
+ 0.7934963703155518,
+ 0.8512696027755737,
+ 0.8001073598861694,
+ 0.8691814541816711,
+ 0.7467926144599915,
+ 0.73629230260849,
+ 0.8151296377182007,
+ 0.7516558766365051,
+ 0.8484403491020203,
+ 0.8413815498352051,
+ 0.7151666879653931,
+ 0.6833842396736145,
+ 0.7645551562309265,
+ 0.8128257393836975,
+ 0.8214841485023499,
+ 0.8272411227226257,
+ 0.7388677597045898,
+ 0.7817277908325195,
+ 0.7946875691413879,
+ 0.775445818901062,
+ 0.7832990288734436,
+ 0.7184854745864868,
+ 0.8395812511444092,
+ 0.825157105922699,
+ 0.8496723771095276,
+ 0.752990186214447,
+ 0.7535010576248169,
+ 0.8279983401298523,
+ 0.785174548625946,
+ 0.8172211647033691,
+ 0.8726750016212463,
+ 0.8208408951759338,
+ 0.7668969035148621,
+ 0.8966258764266968,
+ 0.7289772629737854,
+ 0.704111635684967,
+ 0.846718430519104,
+ 0.6758118867874146,
+ 0.7777753472328186,
+ 0.7973424196243286,
+ 0.8451882004737854,
+ 0.81060391664505,
+ 0.7488210797309875,
+ 0.7392869591712952,
+ 0.9020735025405884,
+ 0.7876619696617126,
+ 0.8220584392547607,
+ 0.8000291585922241,
+ 0.7933567762374878,
+ 0.790691614151001
+ ],
+ "training_config": {
+ "batch_size": 16,
+ "num_epochs": 4,
+ "learning_rate": 1e-05,
+ "weight_decay": 0.01,
+ "warmup_steps": 1000,
+ "max_grad_norm": 1.0,
+ "eval_steps": 500,
+ "save_steps": 500
+ },
+ "model_config": {
+ "vocab_size": 4796,
+ "model_dim": 256,
+ "num_heads": 8,
+ "num_layers": 6,
+ "max_sequence_length": 1024,
+ "dropout": 0.1
+ },
+ "model_name": "okai-musiclang-structure",
+ "model_version": "v2.0",
+ "model_type": "structure"
+ }
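The `epoch_losses` array above logs one smoothed loss value per logging window up to step 6000. A small helper for eyeballing whether the run is still improving (the function is ours for illustration, not part of the repo):

```python
import json
import statistics

def summarize_losses(path, window=50):
    """Compare mean loss over the first and last `window` entries
    of epoch_losses in a training_state.json file."""
    with open(path) as f:
        state = json.load(f)
    losses = state["epoch_losses"]
    return {
        "steps_logged": len(losses),
        "early_mean": statistics.mean(losses[:window]),
        "recent_mean": statistics.mean(losses[-window:]),
    }
```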