# BART-OM-woprefix
This model is a fine-tuned version of facebook/bart-base on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0624
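A minimal inference sketch, assuming the checkpoint is published as `tlam25/BART-OM-woprefix` and loads as a standard BART sequence-to-sequence model (the input string is a placeholder, since the task and dataset are undocumented):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "tlam25/BART-OM-woprefix"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Placeholder input: the task this model was fine-tuned for is not documented.
inputs = tokenizer("Example input text", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```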
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
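The hyperparameters above map onto `Seq2SeqTrainingArguments` roughly as follows. This is a sketch, not the original training script: the 10-step evaluation cadence is inferred from the results table, and the dataset objects are placeholders.

```python
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

args = Seq2SeqTrainingArguments(
    output_dir="BART-OM-woprefix",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    num_train_epochs=5,
    lr_scheduler_type="linear",
    seed=42,
    optim="adamw_torch",            # AdamW with default betas=(0.9, 0.999), eps=1e-8
    eval_strategy="steps",          # evaluation every 10 steps, inferred from the log
    eval_steps=10,
    logging_steps=10,
)

model = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-base")
tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")

# Placeholders: the actual training/evaluation data is undocumented.
train_dataset = eval_dataset = None

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    processing_class=tokenizer,
)
# trainer.train()  # uncomment once real datasets are supplied
```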
### Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
8.2296 | 0.0427 | 10 | 6.3098 |
5.8265 | 0.0855 | 20 | 4.2405 |
4.1489 | 0.1282 | 30 | 3.0542 |
3.2292 | 0.1709 | 40 | 2.4413 |
2.6792 | 0.2137 | 50 | 2.0664 |
2.2966 | 0.2564 | 60 | 1.7725 |
2.0282 | 0.2991 | 70 | 1.5242 |
1.7674 | 0.3419 | 80 | 1.2952 |
1.5055 | 0.3846 | 90 | 1.0840 |
1.2755 | 0.4274 | 100 | 0.8667 |
1.0735 | 0.4701 | 110 | 0.6837 |
0.8622 | 0.5128 | 120 | 0.5300 |
0.6948 | 0.5556 | 130 | 0.3969 |
0.5288 | 0.5983 | 140 | 0.2963 |
0.4008 | 0.6410 | 150 | 0.2250 |
0.3098 | 0.6838 | 160 | 0.1712 |
0.2326 | 0.7265 | 170 | 0.1459 |
0.1835 | 0.7692 | 180 | 0.1129 |
0.1489 | 0.8120 | 190 | 0.0956 |
0.1251 | 0.8547 | 200 | 0.0898 |
0.1164 | 0.8974 | 210 | 0.0838 |
0.1027 | 0.9402 | 220 | 0.0749 |
0.0962 | 0.9829 | 230 | 0.0781 |
0.0885 | 1.0256 | 240 | 0.0713 |
0.0881 | 1.0684 | 250 | 0.0690 |
0.0838 | 1.1111 | 260 | 0.0673 |
0.082 | 1.1538 | 270 | 0.0686 |
0.0809 | 1.1966 | 280 | 0.0676 |
0.0793 | 1.2393 | 290 | 0.0672 |
0.0747 | 1.2821 | 300 | 0.0669 |
0.0754 | 1.3248 | 310 | 0.0666 |
0.0791 | 1.3675 | 320 | 0.0704 |
0.0719 | 1.4103 | 330 | 0.0645 |
0.0724 | 1.4530 | 340 | 0.0656 |
0.073 | 1.4957 | 350 | 0.0694 |
0.0785 | 1.5385 | 360 | 0.0668 |
0.0709 | 1.5812 | 370 | 0.0646 |
0.0721 | 1.6239 | 380 | 0.0634 |
0.0729 | 1.6667 | 390 | 0.0636 |
0.0689 | 1.7094 | 400 | 0.0644 |
0.0712 | 1.7521 | 410 | 0.0639 |
0.0709 | 1.7949 | 420 | 0.0639 |
0.0745 | 1.8376 | 430 | 0.0694 |
0.0802 | 1.8803 | 440 | 0.0667 |
0.0736 | 1.9231 | 450 | 0.0682 |
0.0704 | 1.9658 | 460 | 0.0656 |
0.0698 | 2.0085 | 470 | 0.0653 |
0.0685 | 2.0513 | 480 | 0.0646 |
0.066 | 2.0940 | 490 | 0.0633 |
0.0664 | 2.1368 | 500 | 0.0614 |
0.0702 | 2.1795 | 510 | 0.0663 |
0.068 | 2.2222 | 520 | 0.0679 |
0.0631 | 2.2650 | 530 | 0.0651 |
0.0699 | 2.3077 | 540 | 0.0639 |
0.0679 | 2.3504 | 550 | 0.0624 |
0.0659 | 2.3932 | 560 | 0.0616 |
0.0684 | 2.4359 | 570 | 0.0645 |
0.0659 | 2.4786 | 580 | 0.0684 |
0.0684 | 2.5214 | 590 | 0.0631 |
0.0688 | 2.5641 | 600 | 0.0617 |
0.0646 | 2.6068 | 610 | 0.0619 |
0.0632 | 2.6496 | 620 | 0.0622 |
0.0658 | 2.6923 | 630 | 0.0621 |
0.0642 | 2.7350 | 640 | 0.0646 |
0.0615 | 2.7778 | 650 | 0.0625 |
0.0704 | 2.8205 | 660 | 0.0605 |
0.0652 | 2.8632 | 670 | 0.0647 |
0.059 | 2.9060 | 680 | 0.0623 |
0.062 | 2.9487 | 690 | 0.0609 |
0.0593 | 2.9915 | 700 | 0.0588 |
0.0571 | 3.0342 | 710 | 0.0631 |
0.0631 | 3.0769 | 720 | 0.0630 |
0.0637 | 3.1197 | 730 | 0.0630 |
0.0615 | 3.1624 | 740 | 0.0616 |
0.0585 | 3.2051 | 750 | 0.0612 |
0.0589 | 3.2479 | 760 | 0.0635 |
0.0613 | 3.2906 | 770 | 0.0605 |
0.062 | 3.3333 | 780 | 0.0613 |
0.066 | 3.3761 | 790 | 0.0614 |
0.062 | 3.4188 | 800 | 0.0609 |
0.0613 | 3.4615 | 810 | 0.0645 |
0.0601 | 3.5043 | 820 | 0.0609 |
0.0603 | 3.5470 | 830 | 0.0638 |
0.0638 | 3.5897 | 840 | 0.0618 |
0.0561 | 3.6325 | 850 | 0.0632 |
0.0555 | 3.6752 | 860 | 0.0630 |
0.0601 | 3.7179 | 870 | 0.0622 |
0.06 | 3.7607 | 880 | 0.0609 |
0.0582 | 3.8034 | 890 | 0.0627 |
0.0583 | 3.8462 | 900 | 0.0605 |
0.059 | 3.8889 | 910 | 0.0615 |
0.0617 | 3.9316 | 920 | 0.0634 |
0.0562 | 3.9744 | 930 | 0.0617 |
0.0572 | 4.0171 | 940 | 0.0606 |
0.0564 | 4.0598 | 950 | 0.0612 |
0.0602 | 4.1026 | 960 | 0.0613 |
0.0601 | 4.1453 | 970 | 0.0653 |
0.0584 | 4.1880 | 980 | 0.0618 |
0.0579 | 4.2308 | 990 | 0.0622 |
0.0579 | 4.2735 | 1000 | 0.0629 |
0.0554 | 4.3162 | 1010 | 0.0615 |
0.0563 | 4.3590 | 1020 | 0.0633 |
0.0573 | 4.4017 | 1030 | 0.0622 |
0.0576 | 4.4444 | 1040 | 0.0606 |
0.059 | 4.4872 | 1050 | 0.0602 |
0.0609 | 4.5299 | 1060 | 0.0637 |
0.0603 | 4.5726 | 1070 | 0.0617 |
0.0591 | 4.6154 | 1080 | 0.0631 |
0.0566 | 4.6581 | 1090 | 0.0614 |
0.06 | 4.7009 | 1100 | 0.0631 |
0.0574 | 4.7436 | 1110 | 0.0637 |
0.0618 | 4.7863 | 1120 | 0.0614 |
0.0568 | 4.8291 | 1130 | 0.0612 |
0.054 | 4.8718 | 1140 | 0.0613 |
0.0534 | 4.9145 | 1150 | 0.0615 |
0.0541 | 4.9573 | 1160 | 0.0622 |
0.0594 | 5.0 | 1170 | 0.0624 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
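To match this environment, the pinned versions can be checked at runtime; a minimal sanity check (newer versions may also work):

```python
import datasets
import tokenizers
import torch
import transformers

# Versions this card was produced with.
print(transformers.__version__)  # expected: 4.51.3
print(torch.__version__)         # expected: 2.6.0+cu124
print(datasets.__version__)      # expected: 3.6.0
print(tokenizers.__version__)    # expected: 0.21.1
```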