# distilbert-finetuned-wines-test
This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 7.6406
- Accuracy: 0.1073
- F1: 0.0763
- Precision: 0.3892
- Recall: 0.2741
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 5
- num_epochs: 150
- mixed_precision_training: Native AMP
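The hyperparameters above can be expressed as a `TrainingArguments` configuration. This is a minimal sketch, not the original training script; `output_dir` is a hypothetical path, and logging/saving settings are left at their defaults:

```python
from transformers import TrainingArguments

# Sketch of a TrainingArguments object matching the listed hyperparameters.
# output_dir is a placeholder, not taken from the original run.
training_args = TrainingArguments(
    output_dir="distilbert-finetuned-wines-test",  # hypothetical
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=5,
    num_train_epochs=150,
    fp16=True,  # "Native AMP" mixed-precision training
)
```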
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
7.8568 | 1.0 | 405 | 7.8441 | 0.0003 | 0.0000 | 0.9819 | 0.0051 |
7.7437 | 2.0 | 810 | 7.7517 | 0.0071 | 0.0004 | 0.9674 | 0.0098 |
7.5479 | 3.0 | 1215 | 7.6125 | 0.0118 | 0.0013 | 0.9686 | 0.0115 |
7.3247 | 4.0 | 1620 | 7.4582 | 0.0139 | 0.0010 | 0.9717 | 0.0107 |
7.1027 | 5.0 | 2025 | 7.3135 | 0.0179 | 0.0017 | 0.9708 | 0.0137 |
6.9025 | 6.0 | 2430 | 7.1793 | 0.0207 | 0.0018 | 0.9652 | 0.0153 |
6.7071 | 7.0 | 2835 | 7.0558 | 0.0226 | 0.0020 | 0.9620 | 0.0180 |
6.5319 | 8.0 | 3240 | 6.9368 | 0.0278 | 0.0037 | 0.9506 | 0.0245 |
6.3619 | 9.0 | 3645 | 6.8341 | 0.0291 | 0.0047 | 0.9427 | 0.0270 |
6.2025 | 10.0 | 4050 | 6.7228 | 0.0334 | 0.0073 | 0.9401 | 0.0328 |
6.044 | 11.0 | 4455 | 6.6184 | 0.0328 | 0.0077 | 0.9311 | 0.0354 |
5.8924 | 12.0 | 4860 | 6.5154 | 0.0418 | 0.0131 | 0.9207 | 0.0451 |
5.7511 | 13.0 | 5265 | 6.4240 | 0.0421 | 0.0126 | 0.9153 | 0.0473 |
5.6025 | 14.0 | 5670 | 6.3483 | 0.0445 | 0.0136 | 0.9063 | 0.0509 |
5.4645 | 15.0 | 6075 | 6.2641 | 0.0479 | 0.0154 | 0.8968 | 0.0573 |
5.3421 | 16.0 | 6480 | 6.1821 | 0.0507 | 0.0181 | 0.8882 | 0.0627 |
5.2152 | 17.0 | 6885 | 6.1202 | 0.0523 | 0.0190 | 0.8753 | 0.0714 |
5.0919 | 18.0 | 7290 | 6.0610 | 0.0554 | 0.0213 | 0.8624 | 0.0772 |
4.9768 | 19.0 | 7695 | 5.9870 | 0.0578 | 0.0244 | 0.8476 | 0.0807 |
4.8617 | 20.0 | 8100 | 5.9313 | 0.0591 | 0.0269 | 0.8403 | 0.0860 |
4.759 | 21.0 | 8505 | 5.8901 | 0.0606 | 0.0281 | 0.8300 | 0.0910 |
4.6531 | 22.0 | 8910 | 5.8281 | 0.0631 | 0.0295 | 0.8178 | 0.0986 |
4.5511 | 23.0 | 9315 | 5.8019 | 0.0637 | 0.0298 | 0.8040 | 0.1054 |
4.4603 | 24.0 | 9720 | 5.7640 | 0.0677 | 0.0320 | 0.7931 | 0.1135 |
4.3631 | 25.0 | 10125 | 5.7210 | 0.0705 | 0.0351 | 0.7801 | 0.1184 |
4.2723 | 26.0 | 10530 | 5.6958 | 0.0708 | 0.0354 | 0.7731 | 0.1211 |
4.175 | 27.0 | 10935 | 5.6639 | 0.0742 | 0.0386 | 0.7588 | 0.1305 |
4.0837 | 28.0 | 11340 | 5.6596 | 0.0770 | 0.0398 | 0.7544 | 0.1329 |
4.0081 | 29.0 | 11745 | 5.6313 | 0.0761 | 0.0398 | 0.7457 | 0.1377 |
3.9183 | 30.0 | 12150 | 5.6082 | 0.0798 | 0.0437 | 0.7369 | 0.1418 |
3.8424 | 31.0 | 12555 | 5.6055 | 0.0792 | 0.0429 | 0.7230 | 0.1464 |
3.7651 | 32.0 | 12960 | 5.5947 | 0.0838 | 0.0465 | 0.7149 | 0.1505 |
3.6876 | 33.0 | 13365 | 5.5751 | 0.0829 | 0.0455 | 0.6984 | 0.1540 |
3.611 | 34.0 | 13770 | 5.5798 | 0.0857 | 0.0494 | 0.6899 | 0.1608 |
3.5356 | 35.0 | 14175 | 5.5667 | 0.0872 | 0.0516 | 0.6789 | 0.1664 |
3.4539 | 36.0 | 14580 | 5.5712 | 0.0891 | 0.0524 | 0.6728 | 0.1673 |
3.3829 | 37.0 | 14985 | 5.5603 | 0.0906 | 0.0546 | 0.6542 | 0.1772 |
3.3048 | 38.0 | 15390 | 5.5663 | 0.0906 | 0.0562 | 0.6443 | 0.1783 |
3.2387 | 39.0 | 15795 | 5.5653 | 0.0922 | 0.0575 | 0.6317 | 0.1848 |
3.1642 | 40.0 | 16200 | 5.5645 | 0.0956 | 0.0587 | 0.6207 | 0.1888 |
3.1003 | 41.0 | 16605 | 5.5578 | 0.0959 | 0.0607 | 0.6133 | 0.1945 |
3.0319 | 42.0 | 17010 | 5.5892 | 0.0974 | 0.0625 | 0.6091 | 0.1975 |
2.962 | 43.0 | 17415 | 5.5974 | 0.0984 | 0.0625 | 0.6024 | 0.1979 |
2.8942 | 44.0 | 17820 | 5.5956 | 0.0968 | 0.0624 | 0.5926 | 0.2030 |
2.8184 | 45.0 | 18225 | 5.6040 | 0.1002 | 0.0656 | 0.5860 | 0.2081 |
2.7515 | 46.0 | 18630 | 5.6344 | 0.1024 | 0.0677 | 0.5857 | 0.2100 |
2.7019 | 47.0 | 19035 | 5.6464 | 0.1049 | 0.0674 | 0.5730 | 0.2150 |
2.6425 | 48.0 | 19440 | 5.6320 | 0.1049 | 0.0676 | 0.5607 | 0.2211 |
2.5707 | 49.0 | 19845 | 5.6883 | 0.1092 | 0.0718 | 0.5660 | 0.2258 |
2.5142 | 50.0 | 20250 | 5.6607 | 0.1089 | 0.0714 | 0.5553 | 0.2283 |
2.4466 | 51.0 | 20655 | 5.7018 | 0.1092 | 0.0724 | 0.5609 | 0.2252 |
2.3944 | 52.0 | 21060 | 5.7154 | 0.1079 | 0.0713 | 0.5491 | 0.2272 |
2.3366 | 53.0 | 21465 | 5.7255 | 0.1104 | 0.0738 | 0.5387 | 0.2356 |
2.2672 | 54.0 | 21870 | 5.7573 | 0.1086 | 0.0720 | 0.5350 | 0.2362 |
2.2248 | 55.0 | 22275 | 5.7714 | 0.1117 | 0.0761 | 0.5211 | 0.2415 |
2.1674 | 56.0 | 22680 | 5.7836 | 0.1114 | 0.0744 | 0.5257 | 0.2402 |
2.1058 | 57.0 | 23085 | 5.8001 | 0.1126 | 0.0775 | 0.5083 | 0.2451 |
2.0561 | 58.0 | 23490 | 5.8374 | 0.1141 | 0.0781 | 0.5173 | 0.2455 |
2.0085 | 59.0 | 23895 | 5.8578 | 0.1107 | 0.0764 | 0.5079 | 0.2437 |
1.9531 | 60.0 | 24300 | 5.8853 | 0.1089 | 0.0740 | 0.4941 | 0.2405 |
1.9042 | 61.0 | 24705 | 5.9047 | 0.1114 | 0.0766 | 0.4978 | 0.2445 |
1.8427 | 62.0 | 25110 | 5.9360 | 0.1089 | 0.0757 | 0.4994 | 0.2453 |
1.7987 | 63.0 | 25515 | 5.9686 | 0.1123 | 0.0773 | 0.4877 | 0.2494 |
1.7574 | 64.0 | 25920 | 5.9918 | 0.1107 | 0.0769 | 0.4823 | 0.2484 |
1.7153 | 65.0 | 26325 | 6.0089 | 0.1123 | 0.0794 | 0.4777 | 0.2513 |
1.6663 | 66.0 | 26730 | 6.0413 | 0.1101 | 0.0776 | 0.4764 | 0.2527 |
1.6224 | 67.0 | 27135 | 6.0659 | 0.1083 | 0.0752 | 0.4698 | 0.2483 |
1.5747 | 68.0 | 27540 | 6.0982 | 0.1110 | 0.0772 | 0.4570 | 0.2552 |
1.543 | 69.0 | 27945 | 6.1168 | 0.1114 | 0.0781 | 0.4574 | 0.2562 |
1.5021 | 70.0 | 28350 | 6.1321 | 0.1126 | 0.0805 | 0.4538 | 0.2600 |
1.4504 | 71.0 | 28755 | 6.2029 | 0.1117 | 0.0785 | 0.4537 | 0.2583 |
1.4256 | 72.0 | 29160 | 6.2433 | 0.1126 | 0.0790 | 0.4593 | 0.2576 |
1.3809 | 73.0 | 29565 | 6.2405 | 0.1104 | 0.0790 | 0.4510 | 0.2596 |
1.35 | 74.0 | 29970 | 6.2621 | 0.1104 | 0.0789 | 0.4448 | 0.2571 |
1.3175 | 75.0 | 30375 | 6.2996 | 0.1104 | 0.0783 | 0.4413 | 0.2580 |
1.2822 | 76.0 | 30780 | 6.3201 | 0.1107 | 0.0785 | 0.4348 | 0.2601 |
1.2377 | 77.0 | 31185 | 6.3567 | 0.1114 | 0.0791 | 0.4398 | 0.2604 |
1.218 | 78.0 | 31590 | 6.3813 | 0.1098 | 0.0779 | 0.4306 | 0.2612 |
1.1795 | 79.0 | 31995 | 6.4148 | 0.1123 | 0.0796 | 0.4333 | 0.2596 |
1.1606 | 80.0 | 32400 | 6.4383 | 0.1126 | 0.0809 | 0.4301 | 0.2631 |
1.1258 | 81.0 | 32805 | 6.4793 | 0.1120 | 0.0789 | 0.4335 | 0.2632 |
1.0967 | 82.0 | 33210 | 6.5069 | 0.1107 | 0.0791 | 0.4288 | 0.2644 |
1.0602 | 83.0 | 33615 | 6.5237 | 0.1098 | 0.0786 | 0.4226 | 0.2607 |
1.0511 | 84.0 | 34020 | 6.5441 | 0.1098 | 0.0783 | 0.4232 | 0.2659 |
1.0078 | 85.0 | 34425 | 6.5998 | 0.1114 | 0.0785 | 0.4219 | 0.2664 |
0.9806 | 86.0 | 34830 | 6.6129 | 0.1092 | 0.0780 | 0.4190 | 0.2650 |
0.9755 | 87.0 | 35235 | 6.6448 | 0.1120 | 0.0805 | 0.4223 | 0.2707 |
0.9484 | 88.0 | 35640 | 6.6673 | 0.1098 | 0.0773 | 0.4133 | 0.2669 |
0.9159 | 89.0 | 36045 | 6.6839 | 0.1095 | 0.0779 | 0.4148 | 0.2672 |
0.8817 | 90.0 | 36450 | 6.7131 | 0.1107 | 0.0801 | 0.4184 | 0.2666 |
0.8702 | 91.0 | 36855 | 6.7626 | 0.1101 | 0.0783 | 0.4130 | 0.2691 |
0.8477 | 92.0 | 37260 | 6.7979 | 0.1095 | 0.0781 | 0.4178 | 0.2678 |
0.8351 | 93.0 | 37665 | 6.8089 | 0.1092 | 0.0778 | 0.4153 | 0.2695 |
0.8227 | 94.0 | 38070 | 6.8445 | 0.1104 | 0.0791 | 0.4109 | 0.2649 |
0.7977 | 95.0 | 38475 | 6.8631 | 0.1107 | 0.0784 | 0.4102 | 0.2697 |
0.7908 | 96.0 | 38880 | 6.9042 | 0.1095 | 0.0778 | 0.4135 | 0.2677 |
0.7653 | 97.0 | 39285 | 6.9268 | 0.1086 | 0.0776 | 0.4101 | 0.2680 |
0.7525 | 98.0 | 39690 | 6.9388 | 0.1098 | 0.0782 | 0.4076 | 0.2669 |
0.7314 | 99.0 | 40095 | 6.9618 | 0.1101 | 0.0785 | 0.4106 | 0.2699 |
0.7121 | 100.0 | 40500 | 6.9931 | 0.1101 | 0.0788 | 0.4143 | 0.2665 |
0.7028 | 101.0 | 40905 | 7.0215 | 0.1107 | 0.0781 | 0.4066 | 0.2718 |
0.6917 | 102.0 | 41310 | 7.0317 | 0.1114 | 0.0787 | 0.4095 | 0.2708 |
0.6759 | 103.0 | 41715 | 7.0728 | 0.1114 | 0.0785 | 0.4104 | 0.2690 |
0.6641 | 104.0 | 42120 | 7.0825 | 0.1092 | 0.0775 | 0.4042 | 0.2705 |
0.6486 | 105.0 | 42525 | 7.1051 | 0.1095 | 0.0772 | 0.4013 | 0.2694 |
0.6399 | 106.0 | 42930 | 7.1259 | 0.1083 | 0.0775 | 0.4045 | 0.2683 |
0.6301 | 107.0 | 43335 | 7.1567 | 0.1083 | 0.0772 | 0.4062 | 0.2682 |
0.6115 | 108.0 | 43740 | 7.1774 | 0.1092 | 0.0773 | 0.4058 | 0.2703 |
0.598 | 109.0 | 44145 | 7.2016 | 0.1083 | 0.0765 | 0.3989 | 0.2702 |
0.589 | 110.0 | 44550 | 7.2437 | 0.1076 | 0.0759 | 0.3995 | 0.2684 |
0.5849 | 111.0 | 44955 | 7.2448 | 0.1083 | 0.0772 | 0.4008 | 0.2723 |
0.5715 | 112.0 | 45360 | 7.2590 | 0.1086 | 0.0778 | 0.3984 | 0.2673 |
0.5582 | 113.0 | 45765 | 7.2695 | 0.1107 | 0.0786 | 0.4029 | 0.2705 |
0.5617 | 114.0 | 46170 | 7.3112 | 0.1079 | 0.0754 | 0.3995 | 0.2685 |
0.5429 | 115.0 | 46575 | 7.3109 | 0.1107 | 0.0784 | 0.4008 | 0.2740 |
0.5391 | 116.0 | 46980 | 7.3383 | 0.1089 | 0.0773 | 0.3980 | 0.2731 |
0.5324 | 117.0 | 47385 | 7.3473 | 0.1098 | 0.0780 | 0.3968 | 0.2732 |
0.5198 | 118.0 | 47790 | 7.3698 | 0.1107 | 0.0775 | 0.4004 | 0.2724 |
0.519 | 119.0 | 48195 | 7.3840 | 0.1095 | 0.0781 | 0.4017 | 0.2710 |
0.5119 | 120.0 | 48600 | 7.3899 | 0.1076 | 0.0766 | 0.3932 | 0.2717 |
0.5037 | 121.0 | 49005 | 7.4169 | 0.1098 | 0.0774 | 0.3993 | 0.2749 |
0.4955 | 122.0 | 49410 | 7.4311 | 0.1092 | 0.0770 | 0.3951 | 0.2750 |
0.4858 | 123.0 | 49815 | 7.4523 | 0.1095 | 0.0787 | 0.3979 | 0.2754 |
0.4775 | 124.0 | 50220 | 7.4525 | 0.1104 | 0.0784 | 0.3950 | 0.2764 |
0.4772 | 125.0 | 50625 | 7.4780 | 0.1086 | 0.0769 | 0.3987 | 0.2739 |
0.4783 | 126.0 | 51030 | 7.4837 | 0.1089 | 0.0765 | 0.3956 | 0.2738 |
0.4732 | 127.0 | 51435 | 7.5108 | 0.1083 | 0.0771 | 0.3931 | 0.2740 |
0.4661 | 128.0 | 51840 | 7.5049 | 0.1086 | 0.0767 | 0.3973 | 0.2747 |
0.4619 | 129.0 | 52245 | 7.5264 | 0.1083 | 0.0774 | 0.3979 | 0.2730 |
0.4536 | 130.0 | 52650 | 7.5330 | 0.1086 | 0.0765 | 0.3957 | 0.2740 |
0.4449 | 131.0 | 53055 | 7.5484 | 0.1076 | 0.0762 | 0.3934 | 0.2750 |
0.4466 | 132.0 | 53460 | 7.5424 | 0.1095 | 0.0771 | 0.3979 | 0.2753 |
0.4442 | 133.0 | 53865 | 7.5598 | 0.1086 | 0.0776 | 0.3962 | 0.2759 |
0.447 | 134.0 | 54270 | 7.5688 | 0.1086 | 0.0771 | 0.3959 | 0.2735 |
0.4403 | 135.0 | 54675 | 7.5803 | 0.1073 | 0.0768 | 0.3920 | 0.2722 |
0.4331 | 136.0 | 55080 | 7.5781 | 0.1089 | 0.0762 | 0.3912 | 0.2778 |
0.4261 | 137.0 | 55485 | 7.5949 | 0.1089 | 0.0770 | 0.3939 | 0.2748 |
0.4229 | 138.0 | 55890 | 7.6006 | 0.1086 | 0.0767 | 0.3927 | 0.2755 |
0.4302 | 139.0 | 56295 | 7.5999 | 0.1073 | 0.0757 | 0.3898 | 0.2752 |
0.4165 | 140.0 | 56700 | 7.6029 | 0.1086 | 0.0769 | 0.3882 | 0.2784 |
0.4211 | 141.0 | 57105 | 7.6077 | 0.1089 | 0.0773 | 0.3905 | 0.2767 |
0.4176 | 142.0 | 57510 | 7.6152 | 0.1089 | 0.0775 | 0.3942 | 0.2762 |
0.4118 | 143.0 | 57915 | 7.6272 | 0.1079 | 0.0767 | 0.3913 | 0.2760 |
0.411 | 144.0 | 58320 | 7.6288 | 0.1083 | 0.0774 | 0.3882 | 0.2754 |
0.412 | 145.0 | 58725 | 7.6275 | 0.1070 | 0.0760 | 0.3881 | 0.2755 |
0.4025 | 146.0 | 59130 | 7.6377 | 0.1076 | 0.0762 | 0.3871 | 0.2747 |
0.4091 | 147.0 | 59535 | 7.6391 | 0.1073 | 0.0763 | 0.3906 | 0.2740 |
0.403 | 148.0 | 59940 | 7.6390 | 0.1073 | 0.0760 | 0.3902 | 0.2739 |
0.407 | 149.0 | 60345 | 7.6393 | 0.1073 | 0.0764 | 0.3894 | 0.2740 |
0.4043 | 150.0 | 60750 | 7.6406 | 0.1073 | 0.0763 | 0.3892 | 0.2741 |
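The pattern in the table (precision near 0.98 while accuracy is near zero in early epochs) is consistent with macro-averaged metrics over many classes. Below is a minimal stdlib sketch of macro-averaged precision/recall/F1 plus plain accuracy, assuming macro averaging with zero-division treated as 0; the original script's metric implementation is not shown in this card and may differ:

```python
def macro_metrics(y_true, y_pred):
    """Return (accuracy, macro precision, macro recall, macro F1).

    Sketch of how the table's columns could be computed; assumes
    macro averaging and 0.0 for undefined per-class ratios.
    """
    labels = sorted(set(y_true) | set(y_pred))
    precisions, recalls, f1s = [], [], []
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        precisions.append(prec)
        recalls.append(rec)
        f1s.append(f1)
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    n = len(labels)
    return accuracy, sum(precisions) / n, sum(recalls) / n, sum(f1s) / n
```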
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
## Model tree for dimitarpg13/distilbert-finetuned-wines-test

Base model: distilbert/distilbert-base-uncased