sandernotenbaert committed
Commit f624222 · verified · 1 Parent(s): a360a5a

Upload structure model checkpoint at step 9500

checkpoint_9500/README.md ADDED
@@ -0,0 +1,46 @@
+ ---
+ library_name: mlx
+ pipeline_tag: text-generation
+ tags:
+ - music
+ - midi
+ - generation
+ - mlx
+ - autoregressive
+ - structure
+ - musiclang
+ ---
+
+ # okai-musiclang-structure v2.0 - Structure Model
+
+ This is an autoregressive structure generation model for music using MLX.
+
+ ## Model Details
+
+ - **Model Type**: Structure Generator (Autoregressive)
+ - **Version**: v2.0
+ - **Step**: 9500
+ - **Architecture**: Transformer with causal language modeling
+ - **Vocabulary Size**: 4796
+ - **Model Dimension**: 256
+ - **Layers**: 6
+ - **Max Sequence Length**: 1024
+
+ ## Training Configuration
+
+ - **Batch Size**: 8
+ - **Learning Rate**: 0.0001
+ - **Training Steps**: 9500
+
+ ## Usage
+
+ This model generates sequential music structure:
+ - Input: Song control tokens (genre, instruments, etc.)
+ - Output: Sequential bar structure with chords and tonality
+
+ Example generation:
+ ```
+ GENRE__ROCK SUBGENRE__ALTERNATIVE START BAR__1 CHORD_DEGREE__1 TONALITY_DEGREE__1 BAR__2 CHORD_DEGREE__4 TONALITY_DEGREE__5 ... WILL_END
+ ```
+
+ Generated with MLX framework for Apple Silicon.
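
For reference, a minimal sketch of inspecting this checkpoint with MLX. It assumes only the layout shown in this commit (config.json beside model.npz); the model class that consumes these weights ships with the training code and is not part of the checkpoint.

```python
import json
import mlx.core as mx

# Read the architecture hyperparameters stored next to the weights.
with open("checkpoint_9500/config.json") as f:
    config = json.load(f)

# model.npz is a NumPy-style archive of MLX arrays (fetched via git-lfs).
weights = mx.load("checkpoint_9500/model.npz")

print(f"{config['architecture']}: dim={config['model_dim']}, "
      f"layers={config['num_layers']}, vocab={config['vocab_size']}")
print(f"{len(weights)} weight arrays, "
      f"{sum(v.size for v in weights.values()):,} parameters in total")
```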
checkpoint_9500/config.json ADDED
@@ -0,0 +1,14 @@
+ {
+ "model_name": "okai-musiclang-structure",
+ "model_version": "v2.0",
+ "model_type": "structure",
+ "global_step": 9500,
+ "architecture": "AutoregressiveTransformer",
+ "training_type": "causal_lm",
+ "vocab_size": 4796,
+ "model_dim": 256,
+ "num_heads": 8,
+ "num_layers": 6,
+ "max_sequence_length": 1024,
+ "dropout": 0.1
+ }
checkpoint_9500/model.npz ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:678e5cec0102544152fcae716e0bbfdfb8e3cabc26e64ebdf1440a0cb706f7d3
+ size 29839884
checkpoint_9500/training_state.json ADDED
@@ -0,0 +1,529 @@
+ {
+ "global_step": 9500,
+ "epoch": 3,
+ "trainer_step": 9500,
+ "learning_rate": 9.999999747378752e-05,
+ "epoch_losses": [
+ 0.6773134469985962,
+ 1.578927755355835,
+ 1.4847643375396729,
+ 1.2828130722045898,
+ 1.1240354776382446,
+ 1.013491153717041,
+ 0.9770295023918152,
+ 0.8010304570198059,
+ 0.7834916710853577,
+ 1.006479263305664,
+ 0.9051234126091003,
+ 0.8390260338783264,
+ 1.0087395906448364,
+ 0.8445046544075012,
+ 0.8630063533782959,
+ 0.920177698135376,
+ 0.9347355365753174,
+ 0.8592101335525513,
+ 0.7463024258613586,
+ 0.8444000482559204,
+ 0.7247169017791748,
+ 0.8622797727584839,
+ 0.9072035551071167,
+ 0.8474545478820801,
+ 0.7374237775802612,
+ 0.7109655141830444,
+ 0.681896448135376,
+ 0.8218080401420593,
+ 0.6939653158187866,
+ 0.8056961297988892,
+ 0.792138934135437,
+ 0.8885697722434998,
+ 0.8424720168113708,
+ 0.737557053565979,
+ 0.7811938524246216,
+ 0.709351122379303,
+ 0.6694256067276001,
+ 0.7591610550880432,
+ 0.831205427646637,
+ 0.7744646668434143,
+ 0.8185214400291443,
+ 0.7883873581886292,
+ 0.9435914158821106,
+ 0.6618115901947021,
+ 0.7260267734527588,
+ 0.6682060956954956,
+ 0.6929033398628235,
+ 0.6771573424339294,
+ 0.7958671450614929,
+ 0.8363136649131775,
+ 0.8384776711463928,
+ 0.7499340176582336,
+ 0.7600743770599365,
+ 0.7190064787864685,
+ 0.710116446018219,
+ 0.6240118741989136,
+ 0.7444449067115784,
+ 0.7009372711181641,
+ 0.6914046406745911,
+ 0.7078356742858887,
+ 0.7198120951652527,
+ 0.7548741698265076,
+ 0.7075045704841614,
+ 0.6468173861503601,
+ 0.7135937809944153,
+ 0.6104033589363098,
+ 0.6349422931671143,
+ 0.7819088101387024,
+ 0.7685704827308655,
+ 0.6503170132637024,
+ 0.8457121253013611,
+ 0.6673004627227783,
+ 0.7071408033370972,
+ 0.9045859575271606,
+ 0.7331894636154175,
+ 0.7964046001434326,
+ 0.7071393132209778,
+ 0.7714788317680359,
+ 0.7315903306007385,
+ 0.730576753616333,
+ 0.583000898361206,
+ 0.6419753432273865,
+ 0.698472261428833,
+ 0.7321234345436096,
+ 0.8402087688446045,
+ 0.7465482354164124,
+ 0.7055824398994446,
+ 0.6263769268989563,
+ 0.6651383638381958,
+ 0.6063735485076904,
+ 0.7015822529792786,
+ 0.6749475598335266,
+ 0.6808756589889526,
+ 0.6978121399879456,
+ 0.723631739616394,
+ 0.7400171160697937,
+ 0.6401695013046265,
+ 0.734099805355072,
+ 0.73606938123703,
+ 0.7177640199661255,
+ 0.5971183180809021,
+ 0.5770699381828308,
+ 0.7874430418014526,
+ 0.7243313193321228,
+ 0.7453527450561523,
+ 0.6576721668243408,
+ 0.586551308631897,
+ 0.604226291179657,
+ 0.7859461307525635,
+ 0.7151143550872803,
+ 0.7220450639724731,
+ 0.677390992641449,
+ 0.7558555603027344,
+ 0.6798003911972046,
+ 0.7116447687149048,
+ 0.7652080059051514,
+ 0.5564831495285034,
+ 0.6720426678657532,
+ 0.7096203565597534,
+ 0.8285616040229797,
+ 0.6409652829170227,
+ 0.7799509167671204,
+ 0.6196882724761963,
+ 0.8066487908363342,
+ 0.6808728575706482,
+ 0.6067438721656799,
+ 0.7956268787384033,
+ 0.6376709342002869,
+ 0.6311450004577637,
+ 0.6560420393943787,
+ 0.6679294109344482,
+ 0.7106379866600037,
+ 0.6901361346244812,
+ 0.6665379405021667,
+ 0.7299391031265259,
+ 0.5952994227409363,
+ 0.6159335374832153,
+ 0.6723544001579285,
+ 0.7637686133384705,
+ 0.637432873249054,
+ 0.6190415620803833,
+ 0.6611568927764893,
+ 0.6753519773483276,
+ 0.7055612206459045,
+ 0.6993380188941956,
+ 0.6499632596969604,
+ 0.7273175120353699,
+ 0.7831818461418152,
+ 0.6155182719230652,
+ 0.7239189743995667,
+ 0.6422118544578552,
+ 0.741333544254303,
+ 0.666513204574585,
+ 0.5563932061195374,
+ 0.5733732581138611,
+ 0.718919575214386,
+ 0.6605110168457031,
+ 0.6882463097572327,
+ 0.7416937947273254,
+ 0.6396051049232483,
+ 0.6981858015060425,
+ 0.7583643198013306,
+ 0.7036352157592773,
+ 0.7000483274459839,
+ 0.6477225422859192,
+ 0.6219835877418518,
+ 0.7240121960639954,
+ 0.6029985547065735,
+ 0.6381210088729858,
+ 0.6150124073028564,
+ 0.6158690452575684,
+ 0.5497281551361084,
+ 0.6962404847145081,
+ 0.6810056567192078,
+ 0.7008506655693054,
+ 0.5994274020195007,
+ 0.5819798111915588,
+ 0.6859465837478638,
+ 0.6061791777610779,
+ 0.6593834757804871,
+ 0.6439335346221924,
+ 0.6715618968009949,
+ 0.5726573467254639,
+ 0.5963189601898193,
+ 0.5798898935317993,
+ 0.7323294878005981,
+ 0.6865823864936829,
+ 0.6924436688423157,
+ 0.6127495765686035,
+ 0.6518658399581909,
+ 0.6629366278648376,
+ 0.6219911575317383,
+ 0.6598042249679565,
+ 0.6862918734550476,
+ 0.650632917881012,
+ 0.84084552526474,
+ 0.782798707485199,
+ 0.6546173691749573,
+ 0.6198826432228088,
+ 0.7008614540100098,
+ 0.6837297677993774,
+ 0.62325519323349,
+ 0.621605634689331,
+ 0.5967110991477966,
+ 0.8502236008644104,
+ 0.6759672164916992,
+ 0.7229551672935486,
+ 0.6522384285926819,
+ 0.64774090051651,
+ 0.5904353857040405,
+ 0.7439717054367065,
+ 0.6065196990966797,
+ 0.6686391234397888,
+ 0.733456015586853,
+ 0.6781181693077087,
+ 0.8116433620452881,
+ 0.5720877647399902,
+ 0.6479737162590027,
+ 0.7558318376541138,
+ 0.6991015672683716,
+ 0.7158660888671875,
+ 0.7423058152198792,
+ 0.6979280114173889,
+ 0.5821377635002136,
+ 0.5791804194450378,
+ 0.6888729929924011,
+ 0.6471275091171265,
+ 0.6007723212242126,
+ 0.6458422541618347,
+ 0.5661718249320984,
+ 0.7068699598312378,
+ 0.6631348729133606,
+ 0.5931293964385986,
+ 0.701033353805542,
+ 0.719917356967926,
+ 0.6561917066574097,
+ 0.6674966216087341,
+ 0.7428232431411743,
+ 0.5837993025779724,
+ 0.5758156776428223,
+ 0.5710763931274414,
+ 0.69753497838974,
+ 0.6517305374145508,
+ 0.5935280323028564,
+ 0.7331358194351196,
+ 0.6282294392585754,
+ 0.6767293214797974,
+ 0.5661870837211609,
+ 0.6255740523338318,
+ 0.6543374061584473,
+ 0.5470375418663025,
+ 0.6049108505249023,
+ 0.5972854495048523,
+ 0.6578058004379272,
+ 0.6542236804962158,
+ 0.6630852222442627,
+ 0.6899949312210083,
+ 0.615245521068573,
+ 0.5653952956199646,
+ 0.7239052653312683,
+ 0.6567297577857971,
+ 0.5301370024681091,
+ 0.6231946349143982,
+ 0.6864354014396667,
+ 0.6444852948188782,
+ 0.5984693765640259,
+ 0.6691752076148987,
+ 0.6771892309188843,
+ 0.6393926739692688,
+ 0.5505920648574829,
+ 0.6260342001914978,
+ 0.6211017966270447,
+ 0.588700532913208,
+ 0.6229467988014221,
+ 0.6726985573768616,
+ 0.7630812525749207,
+ 0.6476273536682129,
+ 0.651519775390625,
+ 0.6531612277030945,
+ 0.6063274145126343,
+ 0.5636801719665527,
+ 0.640821635723114,
+ 0.6389163732528687,
+ 0.6221607327461243,
+ 0.5702659487724304,
+ 0.6197552680969238,
+ 0.6191220283508301,
+ 0.6071435213088989,
+ 0.5472128987312317,
+ 0.6064696311950684,
+ 0.6823537349700928,
+ 0.5768254399299622,
+ 0.662284791469574,
+ 0.5700265765190125,
+ 0.6639499068260193,
+ 0.6379228830337524,
+ 0.650199830532074,
+ 0.5963233709335327,
+ 0.7157578468322754,
+ 0.6593167185783386,
+ 0.6377428770065308,
+ 0.5815854668617249,
+ 0.6471275091171265,
+ 0.6874045133590698,
+ 0.6959004402160645,
+ 0.644058346748352,
+ 0.5719549655914307,
+ 0.6862888336181641,
+ 0.618023693561554,
+ 0.569053053855896,
+ 0.7293602824211121,
+ 0.6058862209320068,
+ 0.5935335159301758,
+ 0.5907243490219116,
+ 0.6947252750396729,
+ 0.6774523258209229,
+ 0.5775232911109924,
+ 0.6719413995742798,
+ 0.5609920620918274,
+ 0.621250569820404,
+ 0.582817792892456,
+ 0.5910060405731201,
+ 0.6648291945457458,
+ 0.7149035334587097,
+ 0.6498753428459167,
+ 0.5759541988372803,
+ 0.530289351940155,
+ 0.5688939690589905,
+ 0.6314664483070374,
+ 0.6668184399604797,
+ 0.6384737491607666,
+ 0.576710045337677,
+ 0.6804152131080627,
+ 0.6422805786132812,
+ 0.6479368209838867,
+ 0.6427617073059082,
+ 0.597162127494812,
+ 0.6780084371566772,
+ 0.6432888507843018,
+ 0.6352243423461914,
+ 0.6341366767883301,
+ 0.6038578152656555,
+ 0.6792298555374146,
+ 0.5951770544052124,
+ 0.6421379446983337,
+ 0.6426675915718079,
+ 0.6344843506813049,
+ 0.7225738167762756,
+ 0.7490536570549011,
+ 0.6521580815315247,
+ 0.6312218308448792,
+ 0.5982030630111694,
+ 0.7109475135803223,
+ 0.6941408514976501,
+ 0.6129875779151917,
+ 0.640348494052887,
+ 0.6208691000938416,
+ 0.5432212352752686,
+ 0.6325656175613403,
+ 0.7440668940544128,
+ 0.6143329739570618,
+ 0.6241893172264099,
+ 0.5971760153770447,
+ 0.5734198689460754,
+ 0.6907607913017273,
+ 0.6044325232505798,
+ 0.6585653424263,
+ 0.6311697363853455,
+ 0.6696928143501282,
+ 0.644725501537323,
+ 0.6279332637786865,
+ 0.5993947982788086,
+ 0.620881199836731,
+ 0.6270554065704346,
+ 0.6007893681526184,
+ 0.6468846797943115,
+ 0.6472495794296265,
+ 0.6218734383583069,
+ 0.6191913485527039,
+ 0.6121871471405029,
+ 0.667190670967102,
+ 0.5914902687072754,
+ 0.5804444551467896,
+ 0.5864625573158264,
+ 0.717406153678894,
+ 0.6891216039657593,
+ 0.661761999130249,
+ 0.6346533298492432,
+ 0.5368253588676453,
+ 0.6610974073410034,
+ 0.6501433253288269,
+ 0.5615069270133972,
+ 0.7417892217636108,
+ 0.6090474128723145,
+ 0.5593637228012085,
+ 0.6524268984794617,
+ 0.6354401111602783,
+ 0.5545029044151306,
+ 0.6492282748222351,
+ 0.6251735687255859,
+ 0.6369456052780151,
+ 0.5801509618759155,
+ 0.6664572954177856,
+ 0.5858277678489685,
+ 0.7278642654418945,
+ 0.6117011904716492,
+ 0.6930475831031799,
+ 0.6249889135360718,
+ 0.6094170212745667,
+ 0.597905158996582,
+ 0.6319848299026489,
+ 0.640795111656189,
+ 0.703569233417511,
+ 0.5871070623397827,
+ 0.6020278930664062,
+ 0.6596336364746094,
+ 0.5836296677589417,
+ 0.6171932816505432,
+ 0.7675454020500183,
+ 0.6672639846801758,
+ 0.567331850528717,
+ 0.5369137525558472,
+ 0.6262825727462769,
+ 0.6259591579437256,
+ 0.6014063358306885,
+ 0.5736784338951111,
+ 0.5547645688056946,
+ 0.7052000761032104,
+ 0.6274820566177368,
+ 0.6902376413345337,
+ 0.727375328540802,
+ 0.5825812816619873,
+ 0.5824758410453796,
+ 0.614475667476654,
+ 0.6145949363708496,
+ 0.7347274422645569,
+ 0.6046358942985535,
+ 0.598690927028656,
+ 0.5932267904281616,
+ 0.6156849265098572,
+ 0.632474422454834,
+ 0.6400634050369263,
+ 0.6038522720336914,
+ 0.5702252388000488,
+ 0.6255091428756714,
+ 0.5860032439231873,
+ 0.6064445972442627,
+ 0.5570387840270996,
+ 0.6616458892822266,
+ 0.5858985185623169,
+ 0.6105069518089294,
+ 0.5896003246307373,
+ 0.6204602122306824,
+ 0.6446784734725952,
+ 0.6313497424125671,
+ 0.6719675064086914,
+ 0.5302047729492188,
+ 0.6193255186080933,
+ 0.6931390762329102,
+ 0.5984569191932678,
+ 0.661103367805481,
+ 0.5169515013694763,
+ 0.5580878853797913,
+ 0.677649736404419,
+ 0.626254677772522,
+ 0.6104887127876282,
+ 0.6773074269294739,
+ 0.620963990688324,
+ 0.5942549109458923,
+ 0.657702624797821,
+ 0.58096843957901,
+ 0.6145570874214172,
+ 0.6552225947380066,
+ 0.6108534932136536,
+ 0.6015941500663757,
+ 0.5403030514717102,
+ 0.6750898361206055,
+ 0.6359735727310181,
+ 0.635654628276825,
+ 0.6127069592475891,
+ 0.643945574760437,
+ 0.6919783353805542,
+ 0.6990990042686462,
+ 0.7458180785179138,
+ 0.5650256872177124,
+ 0.6995920538902283,
+ 0.6542274355888367,
+ 0.8081838488578796,
+ 0.5879368782043457,
+ 0.6548652052879333,
+ 0.6343247294425964,
+ 0.6141842007637024,
+ 0.6773292422294617,
+ 0.584144115447998,
+ 0.6303835511207581,
+ 0.5959101319313049,
+ 0.641253650188446,
+ 0.5734899044036865,
+ 0.6166971921920776,
+ 0.5998880863189697
+ ],
+ "training_config": {
+ "batch_size": 8,
+ "num_epochs": 6,
+ "learning_rate": 0.0001,
+ "weight_decay": 0.01,
+ "warmup_steps": 1000,
+ "max_grad_norm": 1.0,
+ "eval_steps": 500,
+ "save_steps": 500
+ },
+ "model_config": {
+ "vocab_size": 4796,
+ "model_dim": 256,
+ "num_heads": 8,
+ "num_layers": 6,
+ "max_sequence_length": 1024,
+ "dropout": 0.1
+ },
+ "model_name": "okai-musiclang-structure",
+ "model_version": "v2.0",
+ "model_type": "structure"
+ }
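
Since training_state.json is plain JSON, the logged loss history can be inspected directly. A minimal sketch, assuming the checkpoint directory layout above:

```python
import json

# Load the saved trainer state for this checkpoint.
with open("checkpoint_9500/training_state.json") as f:
    state = json.load(f)

losses = state["epoch_losses"]
print(f"step {state['global_step']}, epoch {state['epoch']}, "
      f"lr {state['learning_rate']:.2e}")
print(f"{len(losses)} logged losses, mean of last 50: "
      f"{sum(losses[-50:]) / 50:.4f}")
```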