sandernotenbaert committed
Commit d73ff65 (verified) · Parent: ca39f74

Upload structure model checkpoint at step 10500
checkpoint_10500/README.md ADDED
@@ -0,0 +1,46 @@
+ ---
+ library_name: mlx
+ pipeline_tag: text-generation
+ tags:
+ - music
+ - midi
+ - generation
+ - mlx
+ - autoregressive
+ - structure
+ - musiclang
+ ---
+
+ # okai-musiclang-structure v2.0 - Structure Model
+
+ This is an autoregressive structure generation model for music, built with MLX.
+
+ ## Model Details
+
+ - **Model Type**: Structure Generator (Autoregressive)
+ - **Version**: v2.0
+ - **Step**: 10500
+ - **Architecture**: Transformer with causal language modeling
+ - **Vocabulary Size**: 4796
+ - **Model Dimension**: 256
+ - **Layers**: 6
+ - **Max Sequence Length**: 1024
+
+ ## Training Configuration
+
+ - **Batch Size**: 8
+ - **Learning Rate**: 0.0001
+ - **Training Steps**: 10500
+
+ ## Usage
+
+ This model generates sequential music structure:
+ - Input: song control tokens (genre, instruments, etc.)
+ - Output: sequential bar structure with chords and tonality
+
+ Example generation:
+ ```
+ GENRE__ROCK SUBGENRE__ALTERNATIVE START BAR__1 CHORD_DEGREE__1 TONALITY_DEGREE__1 BAR__2 CHORD_DEGREE__4 TONALITY_DEGREE__5 ... WILL_END
+ ```
+
+ Generated with the MLX framework for Apple Silicon.
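The flat token stream in the example above can be parsed back into per-bar structure. A minimal sketch, assuming the layout shown in the README example (control tokens before `START`, then `BAR__n` markers each followed by that bar's chord/tonality tokens); `parse_structure` is a hypothetical helper, not part of the released code:

```python
def parse_structure(tokens: str) -> dict:
    """Parse a flat structure-token string into controls and per-bar events.

    Assumed format: KEY__VALUE control tokens before START, then BAR__n
    markers each followed by that bar's tokens, until WILL_END.
    """
    controls, bars, current = {}, {}, None
    for tok in tokens.split():
        if tok in ("START", "WILL_END", "..."):
            continue  # structural markers carry no key/value payload
        key, _, value = tok.partition("__")
        if key == "BAR":
            current = int(value)
            bars[current] = {}
        elif current is None:
            controls[key] = value  # still in the control prefix
        else:
            bars[current][key] = value
    return {"controls": controls, "bars": bars}


seq = ("GENRE__ROCK SUBGENRE__ALTERNATIVE START "
       "BAR__1 CHORD_DEGREE__1 TONALITY_DEGREE__1 "
       "BAR__2 CHORD_DEGREE__4 TONALITY_DEGREE__5 WILL_END")
print(parse_structure(seq))
```

This treats the sequence as strictly linear, which matches the autoregressive generation order the model is trained on.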
checkpoint_10500/config.json ADDED
@@ -0,0 +1,14 @@
+ {
+ "model_name": "okai-musiclang-structure",
+ "model_version": "v2.0",
+ "model_type": "structure",
+ "global_step": 10500,
+ "architecture": "AutoregressiveTransformer",
+ "training_type": "causal_lm",
+ "vocab_size": 4796,
+ "model_dim": 256,
+ "num_heads": 8,
+ "num_layers": 6,
+ "max_sequence_length": 1024,
+ "dropout": 0.1
+ }
checkpoint_10500/model.npz ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7063dd645212de20abdeb53dfcbf342d63c39792160598b0ea7c926d86f42d77
+ size 29839884
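The dimensions in config.json are consistent with the ~29.8 MB float32 checkpoint above. A back-of-the-envelope parameter count, assuming untied input/output embeddings, learned positional embeddings, and a 4x feed-forward width (none of which are stated in the config, so this is only a plausibility check):

```python
# Values taken from checkpoint_10500/config.json
vocab_size, model_dim, num_layers, max_len = 4796, 256, 6, 1024

embed = vocab_size * model_dim   # token embedding table
pos = max_len * model_dim        # learned positional embeddings (assumed)
head = vocab_size * model_dim    # output projection (assumed untied)
# per layer: Q/K/V/O projections + two 4x feed-forward matrices (assumed)
per_layer = 4 * model_dim**2 + 2 * (4 * model_dim) * model_dim

params = embed + pos + head + num_layers * per_layer
print(params)       # 7436288 parameters (~7.4M)
print(params * 4)   # 29745152 bytes at float32, close to the 29,839,884-byte model.npz
```

The small remainder is plausibly layer norms, biases, and npz container overhead, so the stated architecture and file size line up.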
checkpoint_10500/training_state.json ADDED
@@ -0,0 +1,436 @@
+ {
+ "global_step": 10500,
+ "epoch": 4,
+ "trainer_step": 10500,
+ "learning_rate": 9.999999747378752e-05,
+ "epoch_losses": [
+ 0.6068229675292969,
+ 0.6147122979164124,
+ 0.5358209609985352,
+ 0.5553908348083496,
+ 0.5841445922851562,
+ 0.5548526644706726,
+ 0.5665801167488098,
+ 0.6097950339317322,
+ 0.6165870428085327,
+ 0.5824198126792908,
+ 0.5438722372055054,
+ 0.6124132871627808,
+ 0.5767466425895691,
+ 0.6084809899330139,
+ 0.5802693963050842,
+ 0.5308706164360046,
+ 0.6285377144813538,
+ 0.5417285561561584,
+ 0.6666320562362671,
+ 0.5727231502532959,
+ 0.5974239706993103,
+ 0.5947246551513672,
+ 0.5877379179000854,
+ 0.5096126794815063,
+ 0.5537489056587219,
+ 0.5724690556526184,
+ 0.7422561049461365,
+ 0.6470662951469421,
+ 0.5441425442695618,
+ 0.5641372799873352,
+ 0.608855664730072,
+ 0.5526962280273438,
+ 0.5300586819648743,
+ 0.5176234245300293,
+ 0.6177462935447693,
+ 0.6158546805381775,
+ 0.535477876663208,
+ 0.6794624328613281,
+ 0.5503556132316589,
+ 0.6050283312797546,
+ 0.5130244493484497,
+ 0.5733151435852051,
+ 0.6155235171318054,
+ 0.6235335469245911,
+ 0.560245931148529,
+ 0.6626805067062378,
+ 0.5453429222106934,
+ 0.5795163512229919,
+ 0.5905030369758606,
+ 0.6503201127052307,
+ 0.6316166520118713,
+ 0.5720886588096619,
+ 0.584568977355957,
+ 0.5681793689727783,
+ 0.5336548089981079,
+ 0.6025570034980774,
+ 0.6300626993179321,
+ 0.5928053259849548,
+ 0.5384969711303711,
+ 0.6163606643676758,
+ 0.5741029381752014,
+ 0.6004948019981384,
+ 0.6174150109291077,
+ 0.5923458337783813,
+ 0.515004575252533,
+ 0.569686770439148,
+ 0.5873505473136902,
+ 0.6458595991134644,
+ 0.5849032402038574,
+ 0.6182407736778259,
+ 0.6921294927597046,
+ 0.5494351983070374,
+ 0.6256405711174011,
+ 0.6244653463363647,
+ 0.5644569993019104,
+ 0.5912322402000427,
+ 0.5371138453483582,
+ 0.5818576216697693,
+ 0.5721514821052551,
+ 0.5464998483657837,
+ 0.5208649039268494,
+ 0.5483657121658325,
+ 0.5453832745552063,
+ 0.5485217571258545,
+ 0.5788820385932922,
+ 0.6001384258270264,
+ 0.5936784148216248,
+ 0.5621330738067627,
+ 0.4136527180671692,
+ 0.6179426908493042,
+ 0.590961217880249,
+ 0.5501615405082703,
+ 0.6022453904151917,
+ 0.5899928212165833,
+ 0.5472306609153748,
+ 0.5625307559967041,
+ 0.5832496285438538,
+ 0.47777825593948364,
+ 0.4924069046974182,
+ 0.581865131855011,
+ 0.5879999399185181,
+ 0.619421124458313,
+ 0.5611231327056885,
+ 0.5646601915359497,
+ 0.5321586728096008,
+ 0.5887902975082397,
+ 0.546940803527832,
+ 0.6712146997451782,
+ 0.5428193807601929,
+ 0.6019913554191589,
+ 0.568330705165863,
+ 0.5007303357124329,
+ 0.5436994433403015,
+ 0.49487069249153137,
+ 0.5390335917472839,
+ 0.6383445858955383,
+ 0.6272746920585632,
+ 0.5678398609161377,
+ 0.5205780863761902,
+ 0.5687636137008667,
+ 0.6133678555488586,
+ 0.49371853470802307,
+ 0.5017759799957275,
+ 0.6229140758514404,
+ 0.6260181665420532,
+ 0.5337000489234924,
+ 0.5795989632606506,
+ 0.5913246870040894,
+ 0.5171838402748108,
+ 0.5626271963119507,
+ 0.6427653431892395,
+ 0.5758833885192871,
+ 0.5893572568893433,
+ 0.4847794771194458,
+ 0.614666759967804,
+ 0.62349933385849,
+ 0.49999842047691345,
+ 0.5976787805557251,
+ 0.5415936708450317,
+ 0.5505051016807556,
+ 0.5510467886924744,
+ 0.5132595300674438,
+ 0.5849050283432007,
+ 0.5921343564987183,
+ 0.6119957566261292,
+ 0.5744150876998901,
+ 0.5088694095611572,
+ 0.6639502644538879,
+ 0.5765758752822876,
+ 0.6628914475440979,
+ 0.5988654494285583,
+ 0.6030144691467285,
+ 0.5115908980369568,
+ 0.7090424299240112,
+ 0.6015691161155701,
+ 0.5375097990036011,
+ 0.6107122898101807,
+ 0.6100493669509888,
+ 0.6436681151390076,
+ 0.5412232875823975,
+ 0.4883086681365967,
+ 0.5939430594444275,
+ 0.629514217376709,
+ 0.5516510009765625,
+ 0.5875236988067627,
+ 0.6478635668754578,
+ 0.5669489502906799,
+ 0.5700483918190002,
+ 0.6022571325302124,
+ 0.5821655988693237,
+ 0.6028453707695007,
+ 0.6684448719024658,
+ 0.6156103610992432,
+ 0.5626107454299927,
+ 0.5960555076599121,
+ 0.6530018448829651,
+ 0.5708699822425842,
+ 0.6313037276268005,
+ 0.5388955473899841,
+ 0.5975350737571716,
+ 0.5618932247161865,
+ 0.4922546446323395,
+ 0.5383130311965942,
+ 0.5920611023902893,
+ 0.5584353804588318,
+ 0.5603187680244446,
+ 0.6247968673706055,
+ 0.5053911209106445,
+ 0.5876448750495911,
+ 0.5628833770751953,
+ 0.543531060218811,
+ 0.6848689913749695,
+ 0.5899630784988403,
+ 0.606182873249054,
+ 0.6237930059432983,
+ 0.5694372057914734,
+ 0.5375161170959473,
+ 0.4834044873714447,
+ 0.7002958655357361,
+ 0.5785776376724243,
+ 0.5222181677818298,
+ 0.6306176781654358,
+ 0.6093822121620178,
+ 0.5618623495101929,
+ 0.5354885458946228,
+ 0.4929267466068268,
+ 0.514656126499176,
+ 0.5422120690345764,
+ 0.5069894790649414,
+ 0.5951231718063354,
+ 0.6320244073867798,
+ 0.6510388851165771,
+ 0.5482482314109802,
+ 0.5626425743103027,
+ 0.5791078805923462,
+ 0.5709448456764221,
+ 0.575049102306366,
+ 0.5117776393890381,
+ 0.5328512191772461,
+ 0.500633716583252,
+ 0.6518096327781677,
+ 0.6012002229690552,
+ 0.5673923492431641,
+ 0.5171241164207458,
+ 0.6221208572387695,
+ 0.6004402041435242,
+ 0.5773072242736816,
+ 0.6082717776298523,
+ 0.516526997089386,
+ 0.5737775564193726,
+ 0.5515335202217102,
+ 0.5868050456047058,
+ 0.5342029333114624,
+ 0.6061544418334961,
+ 0.6180281639099121,
+ 0.5535492300987244,
+ 0.5539149045944214,
+ 0.6104509830474854,
+ 0.6161192655563354,
+ 0.5675837397575378,
+ 0.5239880084991455,
+ 0.6546217203140259,
+ 0.5809990763664246,
+ 0.5967749357223511,
+ 0.6391072869300842,
+ 0.6709733605384827,
+ 0.49622032046318054,
+ 0.6122916340827942,
+ 0.6360995173454285,
+ 0.5916529893875122,
+ 0.6379596590995789,
+ 0.5477548241615295,
+ 0.5788457989692688,
+ 0.6315295100212097,
+ 0.5536654591560364,
+ 0.5727817416191101,
+ 0.5177866816520691,
+ 0.5929723978042603,
+ 0.6029887199401855,
+ 0.5575534701347351,
+ 0.6782206892967224,
+ 0.6720032691955566,
+ 0.5148356556892395,
+ 0.5456855297088623,
+ 0.5119496583938599,
+ 0.5541788935661316,
+ 0.5986124277114868,
+ 0.6209011077880859,
+ 0.6370325088500977,
+ 0.5069963335990906,
+ 0.5966598987579346,
+ 0.5716536045074463,
+ 0.5650112628936768,
+ 0.5016883015632629,
+ 0.6148310899734497,
+ 0.5626320838928223,
+ 0.61522376537323,
+ 0.5916692018508911,
+ 0.5834066271781921,
+ 0.6389506459236145,
+ 0.517223596572876,
+ 0.5727995038032532,
+ 0.5091632008552551,
+ 0.6439189910888672,
+ 0.5508335828781128,
+ 0.5776335000991821,
+ 0.6683233380317688,
+ 0.5299586653709412,
+ 0.6527262330055237,
+ 0.5659148097038269,
+ 0.5672585368156433,
+ 0.5889172554016113,
+ 0.6158193349838257,
+ 0.5959590673446655,
+ 0.5557019114494324,
+ 0.631460428237915,
+ 0.6409713625907898,
+ 0.5368033647537231,
+ 0.5725458860397339,
+ 0.6339694857597351,
+ 0.6118927597999573,
+ 0.5586380362510681,
+ 0.58585125207901,
+ 0.6018088459968567,
+ 0.5556765198707581,
+ 0.5060271620750427,
+ 0.5764479041099548,
+ 0.6187633872032166,
+ 0.5457466244697571,
+ 0.5537747740745544,
+ 0.5490426421165466,
+ 0.5918760895729065,
+ 0.5713350772857666,
+ 0.5308899283409119,
+ 0.5995064973831177,
+ 0.6761916875839233,
+ 0.7229323387145996,
+ 0.560222327709198,
+ 0.5976433753967285,
+ 0.5336561799049377,
+ 0.5496469736099243,
+ 0.5488177537918091,
+ 0.5486608147621155,
+ 0.5080862045288086,
+ 0.5767183303833008,
+ 0.5873817801475525,
+ 0.65555340051651,
+ 0.5260395407676697,
+ 0.6220614314079285,
+ 0.5629678964614868,
+ 0.5108757019042969,
+ 0.5894508957862854,
+ 0.5952895283699036,
+ 0.6248810291290283,
+ 0.6370676159858704,
+ 0.5992563962936401,
+ 0.6197344064712524,
+ 0.49486052989959717,
+ 0.5667328834533691,
+ 0.5910235643386841,
+ 0.6132674813270569,
+ 0.595587432384491,
+ 0.5661947727203369,
+ 0.5571675896644592,
+ 0.5393549203872681,
+ 0.5079856514930725,
+ 0.5690367817878723,
+ 0.5852253437042236,
+ 0.6278046369552612,
+ 0.537283182144165,
+ 0.6932445764541626,
+ 0.5507816672325134,
+ 0.5561645030975342,
+ 0.5724679231643677,
+ 0.5460627675056458,
+ 0.5554937720298767,
+ 0.5946077108383179,
+ 0.6075481176376343,
+ 0.5754949450492859,
+ 0.5852423310279846,
+ 0.568857729434967,
+ 0.564369261264801,
+ 0.6228911280632019,
+ 0.49017900228500366,
+ 0.5550848245620728,
+ 0.49874427914619446,
+ 0.5637691020965576,
+ 0.6147509217262268,
+ 0.6008241772651672,
+ 0.5274174213409424,
+ 0.505895733833313,
+ 0.5585087537765503,
+ 0.6596308350563049,
+ 0.5464411973953247,
+ 0.6201469302177429,
+ 0.5577707886695862,
+ 0.6074709296226501,
+ 0.5436930060386658,
+ 0.5322259664535522,
+ 0.5605716109275818,
+ 0.6187253594398499,
+ 0.5538547039031982,
+ 0.614111065864563,
+ 0.5404108762741089,
+ 0.5271131992340088,
+ 0.5652844309806824,
+ 0.6292334198951721,
+ 0.5739777684211731,
+ 0.5551609396934509,
+ 0.6132850646972656,
+ 0.5656164884567261,
+ 0.6357395648956299,
+ 0.5830930471420288,
+ 0.5598461031913757,
+ 0.5753377079963684,
+ 0.5994662046432495,
+ 0.6626417636871338,
+ 0.5877900719642639,
+ 0.522404134273529,
+ 0.5639075636863708,
+ 0.6458974480628967,
+ 0.5817210674285889,
+ 0.5111574530601501,
+ 0.5404403805732727,
+ 0.5723444223403931,
+ 0.6585403680801392,
+ 0.6164023876190186
+ ],
+ "training_config": {
+ "batch_size": 8,
+ "num_epochs": 6,
+ "learning_rate": 0.0001,
+ "weight_decay": 0.01,
+ "warmup_steps": 1000,
+ "max_grad_norm": 1.0,
+ "eval_steps": 500,
+ "save_steps": 500
+ },
+ "model_config": {
+ "vocab_size": 4796,
+ "model_dim": 256,
+ "num_heads": 8,
+ "num_layers": 6,
+ "max_sequence_length": 1024,
+ "dropout": 0.1
+ },
+ "model_name": "okai-musiclang-structure",
+ "model_version": "v2.0",
+ "model_type": "structure"
+ }
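The stored "learning_rate" of 9.999999747378752e-05 is just the float32 rendering of the configured 1e-4, which is consistent with a linear warmup over the 1000 warmup_steps followed by a constant rate. A minimal sketch of that schedule (an assumption; the actual scheduler is not included in the checkpoint):

```python
def lr_at(step: int, base_lr: float = 1e-4, warmup_steps: int = 1000) -> float:
    """Linear warmup to base_lr, then constant (assumed schedule)."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr

print(lr_at(500))    # 5e-05, halfway through warmup
print(lr_at(10500))  # 0.0001, the stored value once rounded through float32
```

Any decay schedule (cosine, linear) would have moved the rate below 1e-4 by step 10500, so the stored value argues against one here.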