# roberta-finetuned-wines
This model is a fine-tuned version of [FacebookAI/roberta-base](https://huggingface.co/FacebookAI/roberta-base) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 5.4665
- Accuracy: 0.1073
- F1: 0.0750
- Precision: 0.4777
- Recall: 0.2527
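The evaluation code is not published with this card, so how the per-class scores are aggregated is unknown. A plausible metric function for this kind of multi-class setup, assuming macro averaging (an assumption; the function name and averaging mode are not from the card), could be sketched as:

```python
# Hypothetical metric computation for a multi-class classifier.
# Macro averaging is an assumption; the card does not state the averaging mode.
from collections import Counter

def macro_metrics(y_true, y_pred):
    """Return (accuracy, macro_f1, macro_precision, macro_recall)."""
    labels = sorted(set(y_true) | set(y_pred))
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1  # predicted p, but true class was t
            fn[t] += 1  # true class t was missed
    precisions, recalls, f1s = [], [], []
    for c in labels:
        prec = tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        precisions.append(prec)
        recalls.append(rec)
        f1s.append(f1)
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    n = len(labels)
    return accuracy, sum(f1s) / n, sum(precisions) / n, sum(recalls) / n
```

With many rare classes, macro averaging would explain the pattern above, where precision and recall diverge sharply from accuracy.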
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (PyTorch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 5
- num_epochs: 150
- mixed_precision_training: Native AMP
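The list above maps directly onto Hugging Face `TrainingArguments`. A minimal sketch of that mapping (the `output_dir` is hypothetical; everything else mirrors the list):

```python
# Sketch: the hyperparameters above expressed as transformers.TrainingArguments kwargs.
training_kwargs = dict(
    output_dir="roberta-finetuned-wines",  # hypothetical path
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",          # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=5,
    num_train_epochs=150,
    fp16=True,                    # "Native AMP" mixed-precision training
)

# With transformers installed, these would be passed straight through:
# from transformers import TrainingArguments
# args = TrainingArguments(**training_kwargs)
```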
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
---|---|---|---|---|---|---|---|
7.8629 | 1.0 | 405 | 7.8525 | 0.0009 | 0.0000 | 0.9983 | 0.0004 |
7.7872 | 2.0 | 810 | 7.7982 | 0.0046 | 0.0002 | 0.9798 | 0.0048 |
7.6725 | 3.0 | 1215 | 7.6970 | 0.0071 | 0.0003 | 0.9716 | 0.0068 |
7.5418 | 4.0 | 1620 | 7.5861 | 0.0121 | 0.0010 | 0.9590 | 0.0127 |
7.3997 | 5.0 | 2025 | 7.4690 | 0.0161 | 0.0026 | 0.9473 | 0.0192 |
7.2604 | 6.0 | 2430 | 7.3499 | 0.0210 | 0.0035 | 0.9445 | 0.0217 |
7.1155 | 7.0 | 2835 | 7.2366 | 0.0232 | 0.0042 | 0.9364 | 0.0287 |
6.9688 | 8.0 | 3240 | 7.1219 | 0.0281 | 0.0055 | 0.9341 | 0.0315 |
6.824 | 9.0 | 3645 | 7.0155 | 0.0285 | 0.0056 | 0.9450 | 0.0279 |
6.6814 | 10.0 | 4050 | 6.9050 | 0.0322 | 0.0072 | 0.9330 | 0.0339 |
6.5493 | 11.0 | 4455 | 6.8102 | 0.0328 | 0.0080 | 0.9323 | 0.0329 |
6.4221 | 12.0 | 4860 | 6.7147 | 0.0374 | 0.0106 | 0.9266 | 0.0395 |
6.2916 | 13.0 | 5265 | 6.6268 | 0.0377 | 0.0111 | 0.9203 | 0.0398 |
6.1736 | 14.0 | 5670 | 6.5492 | 0.0430 | 0.0142 | 0.9173 | 0.0484 |
6.0584 | 15.0 | 6075 | 6.4759 | 0.0414 | 0.0105 | 0.9154 | 0.0449 |
5.9497 | 16.0 | 6480 | 6.4023 | 0.0458 | 0.0154 | 0.9090 | 0.0499 |
5.8428 | 17.0 | 6885 | 6.3424 | 0.0455 | 0.0155 | 0.9073 | 0.0508 |
5.742 | 18.0 | 7290 | 6.2803 | 0.0467 | 0.0149 | 0.8990 | 0.0537 |
5.6494 | 19.0 | 7695 | 6.2155 | 0.0492 | 0.0176 | 0.8882 | 0.0592 |
5.5565 | 20.0 | 8100 | 6.1614 | 0.0510 | 0.0176 | 0.8872 | 0.0630 |
5.4674 | 21.0 | 8505 | 6.1120 | 0.0520 | 0.0190 | 0.8791 | 0.0647 |
5.3793 | 22.0 | 8910 | 6.0617 | 0.0557 | 0.0206 | 0.8772 | 0.0687 |
5.2992 | 23.0 | 9315 | 6.0214 | 0.0551 | 0.0200 | 0.8702 | 0.0704 |
5.2134 | 24.0 | 9720 | 5.9748 | 0.0551 | 0.0194 | 0.8687 | 0.0719 |
5.1365 | 25.0 | 10125 | 5.9350 | 0.0594 | 0.0228 | 0.8555 | 0.0798 |
5.0656 | 26.0 | 10530 | 5.8965 | 0.0597 | 0.0228 | 0.8582 | 0.0791 |
4.9908 | 27.0 | 10935 | 5.8596 | 0.0600 | 0.0240 | 0.8467 | 0.0841 |
4.92 | 28.0 | 11340 | 5.8329 | 0.0622 | 0.0252 | 0.8453 | 0.0862 |
4.8491 | 29.0 | 11745 | 5.8045 | 0.0643 | 0.0268 | 0.8407 | 0.0932 |
4.786 | 30.0 | 12150 | 5.7649 | 0.0637 | 0.0269 | 0.8295 | 0.0930 |
4.7202 | 31.0 | 12555 | 5.7437 | 0.0643 | 0.0280 | 0.8251 | 0.0972 |
4.6521 | 32.0 | 12960 | 5.7112 | 0.0659 | 0.0306 | 0.8154 | 0.1004 |
4.5934 | 33.0 | 13365 | 5.6903 | 0.0677 | 0.0292 | 0.8079 | 0.1048 |
4.5291 | 34.0 | 13770 | 5.6676 | 0.0674 | 0.0317 | 0.8060 | 0.1111 |
4.4668 | 35.0 | 14175 | 5.6414 | 0.0690 | 0.0321 | 0.7960 | 0.1116 |
4.4066 | 36.0 | 14580 | 5.6280 | 0.0705 | 0.0330 | 0.7894 | 0.1153 |
4.3487 | 37.0 | 14985 | 5.6050 | 0.0705 | 0.0324 | 0.7812 | 0.1166 |
4.2885 | 38.0 | 15390 | 5.5923 | 0.0718 | 0.0347 | 0.7768 | 0.1196 |
4.2307 | 39.0 | 15795 | 5.5675 | 0.0739 | 0.0379 | 0.7718 | 0.1222 |
4.1779 | 40.0 | 16200 | 5.5597 | 0.0755 | 0.0376 | 0.7670 | 0.1271 |
4.1253 | 41.0 | 16605 | 5.5310 | 0.0758 | 0.0379 | 0.7599 | 0.1306 |
4.0689 | 42.0 | 17010 | 5.5214 | 0.0767 | 0.0386 | 0.7585 | 0.1311 |
4.0148 | 43.0 | 17415 | 5.5017 | 0.0789 | 0.0405 | 0.7539 | 0.1393 |
3.9639 | 44.0 | 17820 | 5.4964 | 0.0795 | 0.0417 | 0.7513 | 0.1364 |
3.9117 | 45.0 | 18225 | 5.4763 | 0.0810 | 0.0428 | 0.7396 | 0.1417 |
3.8563 | 46.0 | 18630 | 5.4707 | 0.0817 | 0.0436 | 0.7384 | 0.1445 |
3.8068 | 47.0 | 19035 | 5.4614 | 0.0810 | 0.0444 | 0.7285 | 0.1440 |
3.7598 | 48.0 | 19440 | 5.4481 | 0.0832 | 0.0467 | 0.7265 | 0.1498 |
3.7076 | 49.0 | 19845 | 5.4413 | 0.0823 | 0.0465 | 0.7230 | 0.1467 |
3.6624 | 50.0 | 20250 | 5.4301 | 0.0848 | 0.0478 | 0.7209 | 0.1544 |
3.6122 | 51.0 | 20655 | 5.4214 | 0.0866 | 0.0486 | 0.7139 | 0.1571 |
3.5628 | 52.0 | 21060 | 5.4193 | 0.0835 | 0.0477 | 0.7048 | 0.1545 |
3.5184 | 53.0 | 21465 | 5.4066 | 0.0838 | 0.0473 | 0.6928 | 0.1598 |
3.466 | 54.0 | 21870 | 5.4030 | 0.0863 | 0.0494 | 0.6899 | 0.1641 |
3.4169 | 55.0 | 22275 | 5.3990 | 0.0863 | 0.0497 | 0.6879 | 0.1635 |
3.3812 | 56.0 | 22680 | 5.3931 | 0.0882 | 0.0506 | 0.6780 | 0.1695 |
3.3313 | 57.0 | 23085 | 5.3860 | 0.0875 | 0.0518 | 0.6729 | 0.1723 |
3.2886 | 58.0 | 23490 | 5.3741 | 0.0878 | 0.0501 | 0.6670 | 0.1732 |
3.2385 | 59.0 | 23895 | 5.3751 | 0.0878 | 0.0513 | 0.6606 | 0.1739 |
3.1984 | 60.0 | 24300 | 5.3793 | 0.0872 | 0.0520 | 0.6548 | 0.1751 |
3.1555 | 61.0 | 24705 | 5.3686 | 0.0885 | 0.0518 | 0.6540 | 0.1775 |
3.1145 | 62.0 | 25110 | 5.3632 | 0.0900 | 0.0537 | 0.6493 | 0.1824 |
3.0739 | 63.0 | 25515 | 5.3644 | 0.0900 | 0.0547 | 0.6473 | 0.1843 |
3.0375 | 64.0 | 25920 | 5.3625 | 0.0900 | 0.0529 | 0.6392 | 0.1819 |
2.9894 | 65.0 | 26325 | 5.3602 | 0.0937 | 0.0564 | 0.6322 | 0.1883 |
2.9475 | 66.0 | 26730 | 5.3619 | 0.0940 | 0.0579 | 0.6357 | 0.1878 |
2.9078 | 67.0 | 27135 | 5.3502 | 0.0928 | 0.0582 | 0.6246 | 0.1899 |
2.8723 | 68.0 | 27540 | 5.3495 | 0.0940 | 0.0571 | 0.6283 | 0.1886 |
2.8288 | 69.0 | 27945 | 5.3416 | 0.0943 | 0.0583 | 0.6169 | 0.1917 |
2.792 | 70.0 | 28350 | 5.3465 | 0.0946 | 0.0582 | 0.6119 | 0.1946 |
2.7566 | 71.0 | 28755 | 5.3492 | 0.0981 | 0.0600 | 0.6071 | 0.2005 |
2.7142 | 72.0 | 29160 | 5.3448 | 0.0981 | 0.0615 | 0.6106 | 0.1970 |
2.6816 | 73.0 | 29565 | 5.3388 | 0.0968 | 0.0614 | 0.6065 | 0.1974 |
2.6467 | 74.0 | 29970 | 5.3422 | 0.0987 | 0.0610 | 0.5998 | 0.2034 |
2.6077 | 75.0 | 30375 | 5.3435 | 0.0984 | 0.0619 | 0.5979 | 0.2030 |
2.5747 | 76.0 | 30780 | 5.3452 | 0.0968 | 0.0618 | 0.5959 | 0.2003 |
2.5382 | 77.0 | 31185 | 5.3526 | 0.0977 | 0.0608 | 0.5895 | 0.1997 |
2.5074 | 78.0 | 31590 | 5.3430 | 0.0971 | 0.0615 | 0.5841 | 0.2064 |
2.4694 | 79.0 | 31995 | 5.3450 | 0.0987 | 0.0630 | 0.5791 | 0.2079 |
2.4394 | 80.0 | 32400 | 5.3446 | 0.1024 | 0.0666 | 0.5791 | 0.2134 |
2.4071 | 81.0 | 32805 | 5.3481 | 0.1011 | 0.0652 | 0.5786 | 0.2099 |
2.3735 | 82.0 | 33210 | 5.3462 | 0.1002 | 0.0651 | 0.5657 | 0.2139 |
2.3436 | 83.0 | 33615 | 5.3474 | 0.1018 | 0.0667 | 0.5692 | 0.2142 |
2.3154 | 84.0 | 34020 | 5.3523 | 0.1042 | 0.0679 | 0.5705 | 0.2137 |
2.2796 | 85.0 | 34425 | 5.3603 | 0.1049 | 0.0682 | 0.5635 | 0.2207 |
2.2542 | 86.0 | 34830 | 5.3531 | 0.1052 | 0.0682 | 0.5631 | 0.2195 |
2.2225 | 87.0 | 35235 | 5.3524 | 0.1042 | 0.0686 | 0.5531 | 0.2262 |
2.1966 | 88.0 | 35640 | 5.3573 | 0.1067 | 0.0700 | 0.5457 | 0.2276 |
2.163 | 89.0 | 36045 | 5.3590 | 0.1045 | 0.0690 | 0.5470 | 0.2275 |
2.1355 | 90.0 | 36450 | 5.3646 | 0.1061 | 0.0696 | 0.5566 | 0.2233 |
2.1093 | 91.0 | 36855 | 5.3593 | 0.1055 | 0.0692 | 0.5457 | 0.2276 |
2.0832 | 92.0 | 37260 | 5.3639 | 0.1055 | 0.0702 | 0.5452 | 0.2306 |
2.0555 | 93.0 | 37665 | 5.3688 | 0.1052 | 0.0701 | 0.5440 | 0.2307 |
2.0363 | 94.0 | 38070 | 5.3708 | 0.1039 | 0.0683 | 0.5391 | 0.2277 |
2.007 | 95.0 | 38475 | 5.3690 | 0.1021 | 0.0667 | 0.5343 | 0.2288 |
1.9851 | 96.0 | 38880 | 5.3814 | 0.1045 | 0.0697 | 0.5360 | 0.2325 |
1.9581 | 97.0 | 39285 | 5.3766 | 0.1052 | 0.0702 | 0.5357 | 0.2339 |
1.9383 | 98.0 | 39690 | 5.3742 | 0.1058 | 0.0706 | 0.5269 | 0.2355 |
1.9118 | 99.0 | 40095 | 5.3754 | 0.1073 | 0.0709 | 0.5229 | 0.2357 |
1.8877 | 100.0 | 40500 | 5.3766 | 0.1073 | 0.0702 | 0.5188 | 0.2405 |
1.8697 | 101.0 | 40905 | 5.3894 | 0.1049 | 0.0702 | 0.5260 | 0.2335 |
1.8469 | 102.0 | 41310 | 5.3910 | 0.1058 | 0.0700 | 0.5177 | 0.2378 |
1.8239 | 103.0 | 41715 | 5.3959 | 0.1083 | 0.0715 | 0.5285 | 0.2365 |
1.8058 | 104.0 | 42120 | 5.3928 | 0.1076 | 0.0716 | 0.5228 | 0.2368 |
1.7831 | 105.0 | 42525 | 5.3927 | 0.1076 | 0.0717 | 0.5154 | 0.2400 |
1.7669 | 106.0 | 42930 | 5.4024 | 0.1079 | 0.0718 | 0.5143 | 0.2410 |
1.7492 | 107.0 | 43335 | 5.4043 | 0.1067 | 0.0715 | 0.5169 | 0.2398 |
1.7252 | 108.0 | 43740 | 5.4017 | 0.1070 | 0.0724 | 0.5117 | 0.2400 |
1.7109 | 109.0 | 44145 | 5.4030 | 0.1079 | 0.0726 | 0.5158 | 0.2408 |
1.6932 | 110.0 | 44550 | 5.4107 | 0.1079 | 0.0724 | 0.5092 | 0.2420 |
1.6714 | 111.0 | 44955 | 5.4031 | 0.1079 | 0.0733 | 0.5043 | 0.2438 |
1.6525 | 112.0 | 45360 | 5.4127 | 0.1101 | 0.0735 | 0.5065 | 0.2441 |
1.6397 | 113.0 | 45765 | 5.4099 | 0.1098 | 0.0749 | 0.5102 | 0.2436 |
1.6289 | 114.0 | 46170 | 5.4149 | 0.1083 | 0.0739 | 0.5009 | 0.2455 |
1.6115 | 115.0 | 46575 | 5.4189 | 0.1083 | 0.0738 | 0.5035 | 0.2416 |
1.5932 | 116.0 | 46980 | 5.4228 | 0.1101 | 0.0746 | 0.5018 | 0.2479 |
1.5798 | 117.0 | 47385 | 5.4271 | 0.1104 | 0.0745 | 0.5010 | 0.2466 |
1.5593 | 118.0 | 47790 | 5.4306 | 0.1095 | 0.0754 | 0.4981 | 0.2480 |
1.5517 | 119.0 | 48195 | 5.4347 | 0.1095 | 0.0743 | 0.4973 | 0.2484 |
1.5436 | 120.0 | 48600 | 5.4350 | 0.1098 | 0.0746 | 0.4974 | 0.2471 |
1.5258 | 121.0 | 49005 | 5.4318 | 0.1083 | 0.0750 | 0.4958 | 0.2472 |
1.5128 | 122.0 | 49410 | 5.4357 | 0.1083 | 0.0749 | 0.4961 | 0.2474 |
1.4983 | 123.0 | 49815 | 5.4365 | 0.1104 | 0.0761 | 0.4949 | 0.2508 |
1.4891 | 124.0 | 50220 | 5.4410 | 0.1076 | 0.0737 | 0.4962 | 0.2483 |
1.4829 | 125.0 | 50625 | 5.4393 | 0.1079 | 0.0749 | 0.4906 | 0.2471 |
1.4669 | 126.0 | 51030 | 5.4462 | 0.1095 | 0.0755 | 0.4891 | 0.2494 |
1.4594 | 127.0 | 51435 | 5.4446 | 0.1076 | 0.0750 | 0.4893 | 0.2529 |
1.447 | 128.0 | 51840 | 5.4466 | 0.1098 | 0.0758 | 0.4889 | 0.2507 |
1.4386 | 129.0 | 52245 | 5.4454 | 0.1095 | 0.0759 | 0.4864 | 0.2510 |
1.4294 | 130.0 | 52650 | 5.4495 | 0.1095 | 0.0766 | 0.4882 | 0.2534 |
1.4179 | 131.0 | 53055 | 5.4516 | 0.1092 | 0.0756 | 0.4859 | 0.2512 |
1.4143 | 132.0 | 53460 | 5.4479 | 0.1089 | 0.0756 | 0.4820 | 0.2545 |
1.404 | 133.0 | 53865 | 5.4548 | 0.1101 | 0.0763 | 0.4868 | 0.2529 |
1.3962 | 134.0 | 54270 | 5.4582 | 0.1083 | 0.0754 | 0.4881 | 0.2518 |
1.3937 | 135.0 | 54675 | 5.4578 | 0.1079 | 0.0750 | 0.4843 | 0.2530 |
1.3887 | 136.0 | 55080 | 5.4570 | 0.1067 | 0.0743 | 0.4805 | 0.2534 |
1.3796 | 137.0 | 55485 | 5.4580 | 0.1089 | 0.0758 | 0.4836 | 0.2521 |
1.3739 | 138.0 | 55890 | 5.4598 | 0.1079 | 0.0750 | 0.4813 | 0.2523 |
1.3702 | 139.0 | 56295 | 5.4617 | 0.1076 | 0.0752 | 0.4829 | 0.2504 |
1.3621 | 140.0 | 56700 | 5.4622 | 0.1079 | 0.0755 | 0.4800 | 0.2522 |
1.355 | 141.0 | 57105 | 5.4628 | 0.1083 | 0.0756 | 0.4842 | 0.2524 |
1.3497 | 142.0 | 57510 | 5.4644 | 0.1073 | 0.0746 | 0.4782 | 0.2535 |
1.3521 | 143.0 | 57915 | 5.4643 | 0.1076 | 0.0751 | 0.4813 | 0.2521 |
1.3486 | 144.0 | 58320 | 5.4641 | 0.1064 | 0.0744 | 0.4771 | 0.2526 |
1.3441 | 145.0 | 58725 | 5.4670 | 0.1079 | 0.0754 | 0.4791 | 0.2535 |
1.3399 | 146.0 | 59130 | 5.4661 | 0.1070 | 0.0745 | 0.4784 | 0.2532 |
1.3403 | 147.0 | 59535 | 5.4670 | 0.1073 | 0.0750 | 0.4786 | 0.2526 |
1.3352 | 148.0 | 59940 | 5.4663 | 0.1076 | 0.0749 | 0.4791 | 0.2532 |
1.336 | 149.0 | 60345 | 5.4664 | 0.1073 | 0.0750 | 0.4789 | 0.2519 |
1.3323 | 150.0 | 60750 | 5.4665 | 0.1073 | 0.0750 | 0.4777 | 0.2527 |
## Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
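To reproduce this environment, the versions above can be pinned as follows (note that the `+cu124` PyTorch build is distributed from the PyTorch CUDA 12.4 index rather than PyPI, so it is shown here as a plain version pin):

```text
transformers==4.51.3
torch==2.6.0
datasets==3.5.0
tokenizers==0.21.1
```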
## Model tree for dimitarpg13/roberta-finetuned-wines

Base model: [FacebookAI/roberta-base](https://huggingface.co/FacebookAI/roberta-base)