# mms-1b-all-bigcgen-baseline-42
This model is a fine-tuned version of facebook/mms-1b-all on the BIGCGEN - BEM dataset. It achieves the following results on the evaluation set:
- Loss: 0.5951
- Wer: 0.5486
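
The card ships without a usage example. As a minimal inference sketch, assuming the checkpoint exposes the standard MMS/Wav2Vec2 CTC interface (the file `audio.wav` is a placeholder):

```python
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "csikasote/mms-1b-all-bigcgen-baseline-42"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# MMS models expect 16 kHz mono audio; "audio.wav" is a placeholder path.
speech, _ = librosa.load("audio.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: take the argmax over the vocabulary at each frame.
ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(ids)[0])
```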
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 30.0
- mixed_precision_training: Native AMP
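
The training script itself is not part of the card; the sketch below is only a hypothetical mapping of the listed hyperparameters onto `transformers.TrainingArguments` (the `output_dir` name is assumed):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mms-1b-all-bigcgen-baseline-42",  # assumed directory name
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=30.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
```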
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 8.4023 | 0.6116 | 100 | 4.1765 | 1.0208 |
| 3.9652 | 1.2202 | 200 | 3.6936 | 1.0021 |
| 3.7644 | 1.8318 | 300 | 3.4173 | 1.0244 |
| 3.4164 | 2.4404 | 400 | 3.1571 | 1.0 |
| 3.3702 | 3.0489 | 500 | 2.9857 | 0.9998 |
| 3.1816 | 3.6606 | 600 | 2.8341 | 0.9998 |
| 2.8067 | 4.2691 | 700 | 1.1324 | 0.9435 |
| 0.9841 | 4.8807 | 800 | 0.6674 | 0.6082 |
| 0.7858 | 5.4893 | 900 | 0.6411 | 0.5486 |
| 0.8181 | 6.0979 | 1000 | 0.6115 | 0.5394 |
| 0.753 | 6.7095 | 1100 | 0.6057 | 0.5283 |
| 0.7433 | 7.3180 | 1200 | 0.5951 | 0.5479 |
| 0.7278 | 7.9297 | 1300 | 0.5814 | 0.5259 |
| 0.7323 | 8.5382 | 1400 | 0.5801 | 0.5193 |
| 0.6886 | 9.1468 | 1500 | 0.5578 | 0.5148 |
| 0.6929 | 9.7584 | 1600 | 0.5602 | 0.5065 |
| 0.6707 | 10.3670 | 1700 | 0.5474 | 0.4968 |
| 0.6633 | 10.9786 | 1800 | 0.5417 | 0.4947 |
| 0.6331 | 11.5872 | 1900 | 0.5444 | 0.4933 |
| 0.6713 | 12.1957 | 2000 | 0.5960 | 0.5701 |
| 0.6394 | 12.8073 | 2100 | 0.5287 | 0.4840 |
| 0.6282 | 13.4159 | 2200 | 0.5611 | 0.5254 |
| 0.5982 | 14.0245 | 2300 | 0.5317 | 0.4829 |
| 0.6165 | 14.6361 | 2400 | 0.5370 | 0.4937 |
| 0.6077 | 15.2446 | 2500 | 0.5381 | 0.4935 |
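
Wer in the table is word error rate: substituted, inserted, and deleted words divided by the number of reference words. A toy illustration with the `evaluate` library, using placeholder strings rather than BIGCGEN - BEM data:

```python
import evaluate

# Word error rate: (substitutions + insertions + deletions) / reference words.
wer = evaluate.load("wer")

# Placeholder strings for illustration only.
predictions = ["the cat sat on mat"]
references = ["the cat sat on the mat"]
print(wer.compute(predictions=predictions, references=references))  # ~0.167
```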
### Framework versions
- Transformers 4.53.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.0