
ai-light-dance_drums_ft_pretrain_wav2vec2-base

This model is a fine-tuned version of gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base on the GARY109/AI_LIGHT_DANCE - ONSET-DRUMS dataset. It achieves the following results on the evaluation set (a sketch of how the WER figure can be computed follows the list):

  • Loss: 1.8991
  • WER: 0.6046
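
The WER above is the word error rate on the evaluation set, i.e. the fraction of token-level errors in the decoded sequences. As a rough illustration only, here is a minimal sketch of how such a score can be computed with the jiwer package; the drum-token names and the choice of jiwer are assumptions, since the card does not show the metric code used during training.

```python
# Minimal WER sketch (illustrative; not the card's actual evaluation code).
import jiwer

reference  = "kick snare hihat hihat kick snare"   # hypothetical ground-truth onset tokens
hypothesis = "kick snare hihat kick kick snare"    # hypothetical model prediction

# jiwer.wer returns (substitutions + insertions + deletions) / reference length.
print(jiwer.wer(reference, hypothesis))
```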

Model description

More information needed

Intended uses & limitations

More information needed
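
No official usage snippet is provided, but the checkpoint is a wav2vec2-base model fine-tuned with a CTC head, so it can in principle be loaded through the standard transformers API. The sketch below assumes the repository ships a matching processor/tokenizer, that inputs are 16 kHz mono audio, and uses an illustrative file name; none of this is confirmed by the card.

```python
# Hedged inference sketch for a wav2vec2 CTC checkpoint (assumptions noted above).
import torch
import torchaudio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id).eval()

waveform, sr = torchaudio.load("drums.wav")                      # illustrative path
waveform = torchaudio.functional.resample(waveform, sr, 16_000)  # wav2vec2 expects 16 kHz
waveform = waveform.mean(dim=0)                                  # downmix to mono

inputs = processor(waveform.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits                   # (batch, time, vocab)

pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))                          # decoded drum-onset tokens
```

Greedy argmax decoding is the simplest option; a CTC beam-search decoder could be substituted if finer transcriptions are needed.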

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments sketch follows the list):

  • learning_rate: 0.0004
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 20
  • num_epochs: 200.0
  • mixed_precision_training: Native AMP
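
For readers reproducing the run, the list above corresponds roughly to the transformers TrainingArguments sketched below. The total train batch size of 64 follows from 8 (per-device batch) × 8 (gradient accumulation steps). The output directory and any fields not listed above are assumptions; the Adam betas and epsilon shown in the list are the optimizer defaults, so no extra arguments are needed for them.

```python
# Hedged sketch: the hyperparameters above expressed as transformers TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ai-light-dance_drums_ft_pretrain_wav2vec2-base",  # assumed
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,   # 8 x 8 -> effective batch size of 64
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=200.0,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```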

Training results

Training Loss | Epoch | Step | Validation Loss | WER
No log 0.9 8 2.0434 0.6226
0.4739 1.9 16 2.1024 0.6247
0.4693 2.9 24 1.9824 0.6211
0.5139 3.9 32 2.2962 0.6429
0.5081 4.9 40 2.2201 0.6292
0.5081 5.9 48 2.1399 0.6208
0.5785 6.9 56 2.1451 0.6417
0.533 7.9 64 2.1184 0.6330
0.5141 8.9 72 2.0230 0.6342
0.4971 9.9 80 2.2137 0.6381
0.4971 10.9 88 2.1159 0.6253
0.5645 11.9 96 2.0966 0.6247
0.4932 12.9 104 1.9249 0.6223
0.4918 13.9 112 2.0445 0.6235
0.5053 14.9 120 2.1317 0.6304
0.5053 15.9 128 2.0723 0.6256
0.5565 16.9 136 2.1390 0.6402
0.4819 17.9 144 1.9556 0.6321
0.5131 18.9 152 1.9886 0.6333
0.4798 19.9 160 1.9700 0.6259
0.4798 20.9 168 1.9771 0.6295
0.5221 21.9 176 1.9880 0.6235
0.4862 22.9 184 2.0994 0.6298
0.4831 23.9 192 2.0521 0.6205
0.4952 24.9 200 1.9838 0.6064
0.4952 25.9 208 2.0319 0.6103
0.5119 26.9 216 2.0419 0.6160
0.4996 27.9 224 2.0073 0.6178
0.488 28.9 232 2.1740 0.6304
0.4978 29.9 240 2.2731 0.6163
0.4978 30.9 248 2.2420 0.6205
0.5259 31.9 256 2.0561 0.6184
0.47 32.9 264 1.9455 0.6136
0.5132 33.9 272 1.9307 0.6043
0.4972 34.9 280 2.0536 0.6127
0.4972 35.9 288 1.9113 0.6223
0.5147 36.9 296 1.9317 0.6286
0.4914 37.9 304 2.1810 0.6241
0.472 38.9 312 2.1403 0.6160
0.4825 39.9 320 2.1141 0.6094
0.4825 40.9 328 2.2870 0.6031
0.5138 41.9 336 2.1404 0.6181
0.48 42.9 344 2.0243 0.6265
0.4598 43.9 352 2.1117 0.6199
0.474 44.9 360 2.0378 0.6321
0.474 45.9 368 2.1919 0.6211
0.4933 46.9 376 2.3645 0.6109
0.4692 47.9 384 2.1920 0.6076
0.4716 48.9 392 2.3663 0.6034
0.4601 49.9 400 2.2838 0.6280
0.4601 50.9 408 2.0287 0.6148
0.4891 51.9 416 2.1346 0.6130
0.4506 52.9 424 2.1556 0.6181
0.4581 53.9 432 2.0560 0.6229
0.4485 54.9 440 1.9944 0.5971
0.4485 55.9 448 1.9791 0.6097
0.4942 56.9 456 2.1166 0.6070
0.4748 57.9 464 2.0271 0.6124
0.4229 58.9 472 2.0437 0.6229
0.45 59.9 480 2.1012 0.6142
0.45 60.9 488 1.9151 0.6049
0.4936 61.9 496 1.8991 0.6046
0.4602 62.9 504 1.9813 0.6112
0.4626 63.9 512 1.9372 0.6136
0.445 64.9 520 1.9060 0.6154
0.445 65.9 528 1.9574 0.6151
0.4907 66.9 536 2.0947 0.6022
0.4723 67.9 544 2.0061 0.6010
0.4103 68.9 552 1.9557 0.6094
0.4808 69.9 560 2.1042 0.6088
0.4808 70.9 568 2.1360 0.6073
0.4682 71.9 576 2.1290 0.6013
0.4472 72.9 584 1.9454 0.5989
0.4259 73.9 592 2.0937 0.6043
0.4464 74.9 600 2.0822 0.6058
0.4464 75.9 608 2.0128 0.6058
0.4775 76.9 616 1.9744 0.6094
0.4394 77.9 624 1.9992 0.6010
0.418 78.9 632 2.1693 0.5947
0.4384 79.9 640 2.1326 0.5923
0.4384 80.9 648 2.1151 0.5950
0.4971 81.9 656 2.1581 0.5923
0.4176 82.9 664 2.0876 0.6013
0.4312 83.9 672 2.1316 0.5935
0.4408 84.9 680 2.2627 0.5971
0.4408 85.9 688 2.2799 0.6112
0.4678 86.9 696 2.1239 0.5989
0.4288 87.9 704 2.1574 0.5983
0.4157 88.9 712 2.2125 0.5908
0.444 89.9 720 2.0542 0.5986
0.444 90.9 728 2.0899 0.5920
0.4694 91.9 736 2.1122 0.6076
0.4314 92.9 744 2.0634 0.5950
0.4348 93.9 752 2.0333 0.6046
0.4558 94.9 760 2.1188 0.5956
0.4558 95.9 768 2.0606 0.5995
0.461 96.9 776 2.0600 0.5971
0.4258 97.9 784 2.0479 0.6040
0.4395 98.9 792 2.1282 0.6055
0.4282 99.9 800 2.0593 0.6043
0.4282 100.9 808 2.0592 0.5920
0.4623 101.9 816 2.0852 0.5944
0.4392 102.9 824 2.2024 0.5920
0.4308 103.9 832 2.1786 0.5935
0.4375 104.9 840 2.1085 0.5911
0.4375 105.9 848 2.0724 0.5974
0.4501 106.9 856 2.1306 0.5881
0.4273 107.9 864 2.1340 0.5899
0.4234 108.9 872 2.1125 0.5980
0.4289 109.9 880 2.0526 0.6007
0.4289 110.9 888 2.0955 0.5884
0.478 111.9 896 2.1146 0.5872
0.4143 112.9 904 2.2310 0.5899
0.4193 113.9 912 2.2165 0.5899
0.4159 114.9 920 2.1631 0.5941
0.4159 115.9 928 2.1371 0.5938
0.4776 116.9 936 2.0972 0.5935
0.4143 117.9 944 2.1248 0.5917
0.4022 118.9 952 2.1317 0.5956
0.4346 119.9 960 2.1237 0.5992
0.4346 120.9 968 2.0684 0.5935
0.4564 121.9 976 2.0722 0.5947
0.4243 122.9 984 2.1361 0.5884
0.413 123.9 992 2.1207 0.5893
0.4113 124.9 1000 2.0697 0.5837
0.4113 125.9 1008 2.1005 0.5875
0.4426 126.9 1016 2.0822 0.5870
0.4255 127.9 1024 2.0572 0.5959
0.4214 128.9 1032 2.0343 0.5935
0.4042 129.9 1040 2.0282 0.5902
0.4042 130.9 1048 2.0314 0.5846
0.4515 131.9 1056 2.0621 0.5870
0.4138 132.9 1064 2.0704 0.5938
0.4289 133.9 1072 2.0222 0.5896
0.3908 134.9 1080 2.0879 0.5855
0.3908 135.9 1088 2.1068 0.5822
0.4489 136.9 1096 2.0702 0.5837
0.4191 137.9 1104 2.1093 0.5881
0.4149 138.9 1112 2.1046 0.5819
0.4127 139.9 1120 2.1729 0.5777
0.4127 140.9 1128 2.1636 0.5810
0.4449 141.9 1136 2.1515 0.5786
0.3977 142.9 1144 2.1531 0.5774
0.4121 143.9 1152 2.0857 0.5816
0.4363 144.9 1160 2.1372 0.5822
0.4363 145.9 1168 2.1902 0.5828
0.4318 146.9 1176 2.1465 0.5831
0.4112 147.9 1184 2.0697 0.5858
0.4292 148.9 1192 2.0850 0.5837
0.4182 149.9 1200 2.1171 0.5846
0.4182 150.9 1208 2.1020 0.5867
0.4381 151.9 1216 2.1052 0.5849
0.4235 152.9 1224 2.1430 0.5864
0.4173 153.9 1232 2.1131 0.5834
0.3927 154.9 1240 2.1134 0.5846
0.3927 155.9 1248 2.1173 0.5846
0.4492 156.9 1256 2.0772 0.5801
0.4313 157.9 1264 2.0309 0.5861
0.4015 158.9 1272 2.0887 0.5819
0.4268 159.9 1280 2.1812 0.5849
0.4268 160.9 1288 2.1568 0.5881
0.4496 161.9 1296 2.0805 0.5801
0.4121 162.9 1304 2.0461 0.5872
0.401 163.9 1312 2.0377 0.5864
0.4192 164.9 1320 2.0183 0.5872
0.4192 165.9 1328 2.0107 0.5855
0.4466 166.9 1336 2.0528 0.5881
0.3981 167.9 1344 2.0511 0.5878
0.3967 168.9 1352 2.0374 0.5867
0.4072 169.9 1360 2.0554 0.5867
0.4072 170.9 1368 2.0388 0.5858
0.4581 171.9 1376 2.0188 0.5914
0.3937 172.9 1384 1.9999 0.5852
0.4074 173.9 1392 1.9738 0.5840
0.4085 174.9 1400 2.0090 0.5843
0.4085 175.9 1408 1.9990 0.5864
0.4224 176.9 1416 2.0391 0.5852
0.4471 177.9 1424 2.0262 0.5855
0.4233 178.9 1432 2.0621 0.5801
0.409 179.9 1440 2.0486 0.5846
0.409 180.9 1448 2.0508 0.5807
0.4518 181.9 1456 2.0241 0.5887
0.4077 182.9 1464 2.0169 0.5843
0.4197 183.9 1472 2.0014 0.5896
0.4237 184.9 1480 2.0189 0.5843
0.4237 185.9 1488 2.0095 0.5867
0.4394 186.9 1496 1.9993 0.5884
0.4299 187.9 1504 2.0097 0.5899
0.4198 188.9 1512 2.0049 0.5870
0.4116 189.9 1520 1.9899 0.5875
0.4116 190.9 1528 1.9814 0.5881
0.445 191.9 1536 1.9820 0.5887
0.4198 192.9 1544 1.9838 0.5881
0.4065 193.9 1552 1.9849 0.5884
0.3917 194.9 1560 1.9803 0.5867
0.3917 195.9 1568 1.9777 0.5881
0.4239 196.9 1576 1.9752 0.5875
0.4183 197.9 1584 1.9766 0.5872
0.3965 198.9 1592 1.9773 0.5872
0.4144 199.9 1600 1.9781 0.5872

Framework versions

  • Transformers 4.24.0.dev0
  • Pytorch 1.12.1+cu113
  • Datasets 2.6.1
  • Tokenizers 0.13.1