# wav2vec2-large-mms-1b-evenki-colab

This model is a fine-tuned version of [facebook/mms-1b-all](https://huggingface.co/facebook/mms-1b-all) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.7563
- Wer: 0.7135

## Model description

More information needed

## Intended uses & limitations

More information needed
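
The card does not yet document usage, but since this is a standard `Wav2Vec2ForCTC` checkpoint, inference should follow the usual transformers CTC recipe. Below is a minimal sketch, not an official usage example; it assumes a 16 kHz mono recording on disk, and the file name `example.wav` is a placeholder:

```python
import librosa
import torch
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "siberian-lang-lab/wav2vec2-large-mms-1b-evenki-colab"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load audio at the 16 kHz rate wav2vec 2.0 models expect; `example.wav` is a placeholder.
speech, _ = librosa.load("example.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: take the most likely token at each frame, then collapse repeats.
ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(ids)[0])
```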

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 0.001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 4
- mixed_precision_training: Native AMP
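
For reference, the list above maps onto a `TrainingArguments` configuration roughly as follows. This is a reconstruction, not the original training script: `output_dir` and the 100-step evaluation cadence (inferred from the results table below) are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-mms-1b-evenki-colab",  # assumed name
    learning_rate=1e-3,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",          # AdamW, torch implementation
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=4,
    fp16=True,                    # "Native AMP" mixed precision
    eval_strategy="steps",        # assumed: the results table logs eval every 100 steps
    eval_steps=100,
)
```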

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 10.0574 | 0.0308 | 100 | 3.4587 | 1.0 |
| 2.6276 | 0.0615 | 200 | 1.7165 | 0.9800 |
| 1.2219 | 0.0923 | 300 | 1.2281 | 0.9274 |
| 1.8242 | 0.1230 | 400 | 1.2651 | 0.9368 |
| 1.1924 | 0.1538 | 500 | 1.1186 | 0.9025 |
| 1.6112 | 0.1846 | 600 | 1.1522 | 0.9098 |
| 1.0251 | 0.2153 | 700 | 1.0574 | 0.8868 |
| 1.5575 | 0.2461 | 800 | 1.1940 | 0.9190 |
| 1.0224 | 0.2768 | 900 | 1.0436 | 0.8864 |
| 1.565 | 0.3076 | 1000 | 1.2350 | 0.9180 |
| 0.9966 | 0.3384 | 1100 | 1.0639 | 0.8853 |
| 1.5941 | 0.3691 | 1200 | 1.2018 | 0.9266 |
| 1.0152 | 0.3999 | 1300 | 1.0436 | 0.8789 |
| 1.5956 | 0.4306 | 1400 | 1.0546 | 0.8680 |
| 0.9592 | 0.4614 | 1500 | 1.0166 | 0.8753 |
| 1.5209 | 0.4922 | 1600 | 0.9918 | 0.8717 |
| 0.9647 | 0.5229 | 1700 | 0.9670 | 0.8561 |
| 1.5689 | 0.5537 | 1800 | 1.0313 | 0.8598 |
| 0.9309 | 0.5844 | 1900 | 0.9857 | 0.8557 |
| 1.4494 | 0.6152 | 2000 | 1.0370 | 0.9009 |
| 0.9489 | 0.6460 | 2100 | 0.9802 | 0.8449 |
| 1.5404 | 0.6767 | 2200 | 1.0277 | 0.8730 |
| 0.93 | 0.7075 | 2300 | 0.9490 | 0.8214 |
| 1.5351 | 0.7382 | 2400 | 1.0075 | 0.8489 |
| 0.86 | 0.7690 | 2500 | 0.9466 | 0.8278 |
| 1.4075 | 0.7998 | 2600 | 0.9908 | 0.8450 |
| 0.8978 | 0.8305 | 2700 | 0.9783 | 0.8193 |
| 1.5049 | 0.8613 | 2800 | 0.9776 | 0.8263 |
| 0.8631 | 0.8920 | 2900 | 0.9383 | 0.8185 |
| 1.4381 | 0.9228 | 3000 | 0.9541 | 0.8292 |
| 0.9529 | 0.9536 | 3100 | 0.9412 | 0.8297 |
| 1.4904 | 0.9843 | 3200 | 0.9731 | 0.8218 |
| 0.9834 | 1.0151 | 3300 | 0.9499 | 0.8034 |
| 1.0134 | 1.0458 | 3400 | 0.9317 | 0.8129 |
| 1.2938 | 1.0766 | 3500 | 0.9105 | 0.8237 |
| 0.9878 | 1.1074 | 3600 | 0.9214 | 0.8042 |
| 1.1626 | 1.1381 | 3700 | 0.9045 | 0.8081 |
| 1.0864 | 1.1689 | 3800 | 0.9101 | 0.8063 |
| 1.21 | 1.1996 | 3900 | 0.9097 | 0.7937 |
| 1.0242 | 1.2304 | 4000 | 0.9051 | 0.7949 |
| 1.1935 | 1.2612 | 4100 | 0.9088 | 0.8090 |
| 0.9784 | 1.2919 | 4200 | 0.9132 | 0.8008 |
| 1.1719 | 1.3227 | 4300 | 0.8861 | 0.7930 |
| 1.0924 | 1.3534 | 4400 | 0.8914 | 0.8075 |
| 1.3006 | 1.3842 | 4500 | 0.8807 | 0.7852 |
| 0.9768 | 1.4149 | 4600 | 0.8859 | 0.7892 |
| 1.1658 | 1.4457 | 4700 | 0.8922 | 0.7870 |
| 0.9708 | 1.4765 | 4800 | 0.8684 | 0.7819 |
| 1.2968 | 1.5072 | 4900 | 0.8680 | 0.7780 |
| 1.0397 | 1.5380 | 5000 | 0.8960 | 0.7854 |
| 1.3094 | 1.5687 | 5100 | 0.8742 | 0.7815 |
| 0.9274 | 1.5995 | 5200 | 0.8732 | 0.7778 |
| 1.2538 | 1.6303 | 5300 | 0.8639 | 0.7721 |
| 1.0089 | 1.6610 | 5400 | 0.8713 | 0.7862 |
| 1.2453 | 1.6918 | 5500 | 0.8620 | 0.7782 |
| 0.9521 | 1.7225 | 5600 | 0.8551 | 0.7830 |
| 1.3161 | 1.7533 | 5700 | 0.8577 | 0.7748 |
| 1.0274 | 1.7841 | 5800 | 0.8573 | 0.7795 |
| 1.2341 | 1.8148 | 5900 | 0.8379 | 0.7600 |
| 0.9222 | 1.8456 | 6000 | 0.8630 | 0.7726 |
| 1.0861 | 1.8763 | 6100 | 0.8489 | 0.7634 |
| 0.9341 | 1.9071 | 6200 | 0.8411 | 0.7606 |
| 1.1717 | 1.9379 | 6300 | 0.8506 | 0.7747 |
| 0.9205 | 1.9686 | 6400 | 0.8626 | 0.7655 |
| 1.1891 | 1.9994 | 6500 | 0.8571 | 0.7706 |
| 0.8629 | 2.0301 | 6600 | 0.8422 | 0.7666 |
| 1.1854 | 2.0609 | 6700 | 0.8483 | 0.7644 |
| 0.8767 | 2.0917 | 6800 | 0.8215 | 0.7536 |
| 1.2317 | 2.1224 | 6900 | 0.8582 | 0.7751 |
| 0.8572 | 2.1532 | 7000 | 0.8308 | 0.7569 |
| 1.2478 | 2.1839 | 7100 | 0.8172 | 0.7457 |
| 0.8012 | 2.2147 | 7200 | 0.8261 | 0.7624 |
| 1.3722 | 2.2455 | 7300 | 0.8543 | 0.7608 |
| 0.8545 | 2.2762 | 7400 | 0.8267 | 0.7617 |
| 1.1439 | 2.3070 | 7500 | 0.8354 | 0.7565 |
| 0.7752 | 2.3377 | 7600 | 0.8200 | 0.7540 |
| 1.3572 | 2.3685 | 7700 | 0.8276 | 0.7524 |
| 0.7083 | 2.3993 | 7800 | 0.8268 | 0.7502 |
| 1.2719 | 2.4300 | 7900 | 0.8276 | 0.7600 |
| 0.7618 | 2.4608 | 8000 | 0.8153 | 0.7540 |
| 1.1886 | 2.4915 | 8100 | 0.8194 | 0.7528 |
| 0.7902 | 2.5223 | 8200 | 0.8111 | 0.7518 |
| 1.2395 | 2.5531 | 8300 | 0.8252 | 0.7410 |
| 0.8243 | 2.5838 | 8400 | 0.8093 | 0.7439 |
| 1.2454 | 2.6146 | 8500 | 0.8115 | 0.7400 |
| 0.8353 | 2.6453 | 8600 | 0.8060 | 0.7370 |
| 1.1679 | 2.6761 | 8700 | 0.8114 | 0.7539 |
| 0.7987 | 2.7069 | 8800 | 0.8039 | 0.7467 |
| 1.1435 | 2.7376 | 8900 | 0.8152 | 0.7474 |
| 0.8146 | 2.7684 | 9000 | 0.7977 | 0.7476 |
| 1.1233 | 2.7991 | 9100 | 0.7973 | 0.7388 |
| 0.7887 | 2.8299 | 9200 | 0.7969 | 0.7398 |
| 1.2149 | 2.8607 | 9300 | 0.8013 | 0.7511 |
| 0.775 | 2.8914 | 9400 | 0.7857 | 0.7328 |
| 1.2072 | 2.9222 | 9500 | 0.7999 | 0.7357 |
| 0.7742 | 2.9529 | 9600 | 0.7901 | 0.7322 |
| 1.2508 | 2.9837 | 9700 | 0.7914 | 0.7368 |
| 0.8621 | 3.0145 | 9800 | 0.7836 | 0.7402 |
| 0.8507 | 3.0452 | 9900 | 0.7790 | 0.7290 |
| 1.0266 | 3.0760 | 10000 | 0.7882 | 0.7329 |
| 0.8631 | 3.1067 | 10100 | 0.7882 | 0.7281 |
| 1.1433 | 3.1375 | 10200 | 0.7823 | 0.7307 |
| 0.8892 | 3.1683 | 10300 | 0.7873 | 0.7321 |
| 1.0211 | 3.1990 | 10400 | 0.7836 | 0.7273 |
| 0.9307 | 3.2298 | 10500 | 0.7818 | 0.7248 |
| 1.1533 | 3.2605 | 10600 | 0.7757 | 0.7266 |
| 0.8969 | 3.2913 | 10700 | 0.7786 | 0.7198 |
| 1.0713 | 3.3221 | 10800 | 0.7787 | 0.7170 |
| 0.8816 | 3.3528 | 10900 | 0.7803 | 0.7199 |
| 1.1048 | 3.3836 | 11000 | 0.7777 | 0.7199 |
| 0.8707 | 3.4143 | 11100 | 0.7750 | 0.7225 |
| 0.9713 | 3.4451 | 11200 | 0.7789 | 0.7177 |
| 0.7791 | 3.4759 | 11300 | 0.7733 | 0.7170 |
| 1.0315 | 3.5066 | 11400 | 0.7691 | 0.7187 |
| 0.7899 | 3.5374 | 11500 | 0.7681 | 0.7180 |
| 0.9405 | 3.5681 | 11600 | 0.7661 | 0.7175 |
| 0.8365 | 3.5989 | 11700 | 0.7629 | 0.7196 |
| 1.1294 | 3.6297 | 11800 | 0.7638 | 0.7153 |
| 0.8537 | 3.6604 | 11900 | 0.7647 | 0.7114 |
| 1.1291 | 3.6912 | 12000 | 0.7593 | 0.7162 |
| 0.7939 | 3.7219 | 12100 | 0.7618 | 0.7110 |
| 1.1158 | 3.7527 | 12200 | 0.7617 | 0.7128 |
| 0.8674 | 3.7835 | 12300 | 0.7583 | 0.7118 |
| 1.0904 | 3.8142 | 12400 | 0.7600 | 0.7138 |
| 0.779 | 3.8450 | 12500 | 0.7574 | 0.7116 |
| 0.9692 | 3.8757 | 12600 | 0.7569 | 0.7116 |
| 0.7636 | 3.9065 | 12700 | 0.7564 | 0.7138 |
| 1.0034 | 3.9373 | 12800 | 0.7577 | 0.7138 |
| 0.7543 | 3.9680 | 12900 | 0.7566 | 0.7140 |
| 1.0205 | 3.9988 | 13000 | 0.7563 | 0.7135 |
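
The Wer column is the word error rate on the evaluation set (lower is better). A minimal sketch of how such a score is typically computed with the `evaluate` library; the strings below are illustrative only, not taken from the actual evaluation data:

```python
import evaluate

wer_metric = evaluate.load("wer")

# Illustrative transcripts only; the card does not include any evaluation text.
predictions = ["hypothetical model output"]
references = ["hypothetical reference transcript"]

print(wer_metric.compute(predictions=predictions, references=references))
```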

### Framework versions

- Transformers 4.52.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.5.1
- Tokenizers 0.21.1