roberta-finetuned-wines

This model is a fine-tuned version of roberta-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.4708
  • Accuracy: 0.7826
  • F1: 0.6998
  • Precision: 0.8330
  • Recall: 0.7875
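
A minimal inference sketch (not part of the original card): it assumes the checkpoint is a RoBERTa sequence-classification head and uses the repository id shown on the model page; the example text is illustrative and the label set is undocumented.

```python
# Hedged usage sketch: assumes a text-classification head on top of roberta-base.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="dimitarpg13/roberta-finetuned-wines-resampled-val-ds",
)

# Illustrative input; the actual label space (wine varieties, regions, etc.) is not documented.
print(classifier("Aromas of blackberry and cassis with firm tannins and a long finish."))
```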

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 5
  • num_epochs: 200
  • mixed_precision_training: Native AMP
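
The settings above map onto a Trainer configuration roughly as sketched below. This is a reconstruction under assumptions, not the original training script: dataset loading, tokenization, and output paths are omitted, and the per-epoch evaluation cadence is inferred from the results table.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-finetuned-wines",    # illustrative path
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=5,
    num_train_epochs=200,
    fp16=True,                               # Native AMP mixed precision
    eval_strategy="epoch",                   # inferred: metrics are reported once per epoch
    logging_strategy="epoch",
)
```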

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall
7.8604 1.0 405 7.8021 0.0046 0.0001 0.9916 0.0025
7.7552 2.0 810 7.6534 0.0084 0.0005 0.9847 0.0053
7.6208 3.0 1215 7.5104 0.0096 0.0004 0.9728 0.0075
7.4786 4.0 1620 7.3656 0.0192 0.0027 0.9642 0.0137
7.3399 5.0 2025 7.2197 0.0235 0.0032 0.9571 0.0175
7.2026 6.0 2430 7.0715 0.0260 0.0044 0.9587 0.0186
7.0616 7.0 2835 6.9360 0.0322 0.0082 0.9483 0.0232
6.9247 8.0 3240 6.8037 0.0328 0.0089 0.9517 0.0231
6.7897 9.0 3645 6.6682 0.0455 0.0129 0.9493 0.0311
6.6593 10.0 4050 6.5377 0.0473 0.0137 0.9381 0.0343
6.5321 11.0 4455 6.4207 0.0517 0.0153 0.9425 0.0340
6.4102 12.0 4860 6.2955 0.0569 0.0205 0.9339 0.0414
6.291 13.0 5265 6.1805 0.0656 0.0270 0.9252 0.0498
6.1789 14.0 5670 6.0748 0.0755 0.0311 0.9258 0.0577
6.0653 15.0 6075 5.9658 0.0795 0.0313 0.9216 0.0597
5.9595 16.0 6480 5.8629 0.0885 0.0388 0.9137 0.0697
5.8568 17.0 6885 5.7689 0.0928 0.0396 0.9113 0.0706
5.7597 18.0 7290 5.6795 0.0965 0.0410 0.9092 0.0731
5.6659 19.0 7695 5.5862 0.1049 0.0461 0.9029 0.0802
5.5714 20.0 8100 5.4965 0.1135 0.0506 0.8985 0.0877
5.484 21.0 8505 5.4127 0.1166 0.0503 0.8954 0.0901
5.3961 22.0 8910 5.3354 0.1250 0.0579 0.8887 0.0987
5.3137 23.0 9315 5.2551 0.1259 0.0571 0.8888 0.0997
5.2296 24.0 9720 5.1792 0.1349 0.0645 0.8856 0.1067
5.1527 25.0 10125 5.1053 0.1457 0.0742 0.8755 0.1203
5.0725 26.0 10530 5.0318 0.1519 0.0750 0.8735 0.1246
4.9995 27.0 10935 4.9612 0.1581 0.0829 0.8711 0.1322
4.9284 28.0 11340 4.8977 0.1661 0.0857 0.8721 0.1347
4.8567 29.0 11745 4.8218 0.1735 0.0937 0.8683 0.1418
4.783 30.0 12150 4.7584 0.1819 0.0971 0.8637 0.1510
4.7193 31.0 12555 4.6947 0.1896 0.1045 0.8617 0.1586
4.6478 32.0 12960 4.6337 0.2020 0.1148 0.8592 0.1727
4.5815 33.0 13365 4.5725 0.2051 0.1148 0.8541 0.1743
4.523 34.0 13770 4.5180 0.2106 0.1214 0.8551 0.1838
4.4537 35.0 14175 4.4538 0.2215 0.1263 0.8512 0.1897
4.3924 36.0 14580 4.4020 0.2230 0.1297 0.8494 0.1935
4.3313 37.0 14985 4.3394 0.2363 0.1408 0.8389 0.2107
4.2718 38.0 15390 4.2840 0.2484 0.1537 0.8427 0.2200
4.2099 39.0 15795 4.2260 0.2614 0.1625 0.8352 0.2358
4.1507 40.0 16200 4.1730 0.2645 0.1676 0.8324 0.2399
4.0893 41.0 16605 4.1179 0.2827 0.1817 0.8301 0.2552
4.0296 42.0 17010 4.0695 0.2836 0.1852 0.8292 0.2610
3.9781 43.0 17415 4.0107 0.3053 0.2016 0.8286 0.2768
3.9233 44.0 17820 3.9595 0.3071 0.2072 0.8179 0.2846
3.8604 45.0 18225 3.9078 0.3195 0.2186 0.8172 0.2980
3.8038 46.0 18630 3.8599 0.3341 0.2277 0.8140 0.3101
3.7492 47.0 19035 3.8101 0.3387 0.2347 0.8154 0.3158
3.6949 48.0 19440 3.7631 0.3542 0.2510 0.8153 0.3319
3.6347 49.0 19845 3.7169 0.3610 0.2535 0.8138 0.3365
3.5873 50.0 20250 3.6666 0.3721 0.2624 0.8083 0.3478
3.5366 51.0 20655 3.6167 0.3848 0.2773 0.8078 0.3634
3.4761 52.0 21060 3.5746 0.3885 0.2799 0.8113 0.3638
3.4254 53.0 21465 3.5332 0.3934 0.2839 0.8057 0.3692
3.3765 54.0 21870 3.4823 0.4105 0.3037 0.8037 0.3910
3.3245 55.0 22275 3.4392 0.4253 0.3176 0.8054 0.4059
3.2754 56.0 22680 3.3974 0.4337 0.3239 0.8034 0.4124
3.2224 57.0 23085 3.3565 0.4408 0.3293 0.7991 0.4177
3.1768 58.0 23490 3.3120 0.4550 0.3429 0.7966 0.4339
3.1202 59.0 23895 3.2699 0.4596 0.3497 0.7995 0.4376
3.0742 60.0 24300 3.2277 0.4692 0.3574 0.7977 0.4497
3.0225 61.0 24705 3.1866 0.4751 0.3627 0.7967 0.4543
2.9801 62.0 25110 3.1455 0.4872 0.3748 0.7943 0.4683
2.9287 63.0 25515 3.1078 0.5014 0.3916 0.7893 0.4848
2.8847 64.0 25920 3.0645 0.5032 0.3909 0.7918 0.4833
2.8325 65.0 26325 3.0316 0.5097 0.3963 0.7925 0.4908
2.7866 66.0 26730 2.9929 0.5280 0.4163 0.7906 0.5092
2.7484 67.0 27135 2.9514 0.5410 0.4302 0.7948 0.5228
2.6964 68.0 27540 2.9139 0.5444 0.4320 0.7966 0.5251
2.653 69.0 27945 2.8757 0.5583 0.4459 0.7941 0.5417
2.6088 70.0 28350 2.8426 0.5605 0.4464 0.7953 0.5415
2.564 71.0 28755 2.8091 0.5725 0.4575 0.7981 0.5481
2.5236 72.0 29160 2.7726 0.5806 0.4725 0.7971 0.5623
2.4776 73.0 29565 2.7376 0.5849 0.4735 0.7967 0.5652
2.4381 74.0 29970 2.7033 0.5939 0.4833 0.7983 0.5764
2.3904 75.0 30375 2.6753 0.6053 0.4958 0.7992 0.5886
2.3502 76.0 30780 2.6403 0.6121 0.5020 0.8015 0.5939
2.3087 77.0 31185 2.6105 0.6174 0.5064 0.8044 0.6001
2.2701 78.0 31590 2.5760 0.6294 0.5233 0.8102 0.6120
2.2276 79.0 31995 2.5457 0.6397 0.5358 0.8074 0.6253
2.1939 80.0 32400 2.5160 0.6443 0.5394 0.8074 0.6302
2.1468 81.0 32805 2.4892 0.6452 0.5398 0.8070 0.6298
2.1113 82.0 33210 2.4566 0.6508 0.5456 0.8080 0.6381
2.0773 83.0 33615 2.4327 0.6539 0.5498 0.8109 0.6382
2.0369 84.0 34020 2.4050 0.6641 0.5625 0.8124 0.6513
1.9986 85.0 34425 2.3793 0.6666 0.5628 0.8116 0.6521
1.9656 86.0 34830 2.3470 0.6709 0.5689 0.8106 0.6579
1.9329 87.0 35235 2.3257 0.6740 0.5718 0.8107 0.6623
1.8962 88.0 35640 2.2960 0.6823 0.5810 0.8145 0.6697
1.8577 89.0 36045 2.2736 0.6898 0.5873 0.8150 0.6791
1.8192 90.0 36450 2.2484 0.6910 0.5887 0.8149 0.6808
1.7895 91.0 36855 2.2235 0.6966 0.5961 0.8153 0.6873
1.7584 92.0 37260 2.2028 0.6944 0.5986 0.8180 0.6850
1.7229 93.0 37665 2.1772 0.7031 0.6063 0.8165 0.6949
1.6962 94.0 38070 2.1558 0.7065 0.6074 0.8195 0.6969
1.6626 95.0 38475 2.1385 0.7077 0.6104 0.8230 0.7005
1.6304 96.0 38880 2.1168 0.7148 0.6172 0.8210 0.7075
1.597 97.0 39285 2.0962 0.7176 0.6237 0.8235 0.7114
1.5736 98.0 39690 2.0748 0.7210 0.6266 0.8215 0.7160
1.5383 99.0 40095 2.0569 0.7250 0.6321 0.8251 0.7197
1.5096 100.0 40500 2.0302 0.7294 0.6350 0.8228 0.7243
1.4847 101.0 40905 2.0159 0.7315 0.6380 0.8261 0.7244
1.4569 102.0 41310 2.0005 0.7318 0.6419 0.8261 0.7274
1.426 103.0 41715 1.9833 0.7340 0.6448 0.8291 0.7291
1.401 104.0 42120 1.9616 0.7408 0.6512 0.8283 0.7369
1.3718 105.0 42525 1.9425 0.7427 0.6502 0.8281 0.7391
1.3496 106.0 42930 1.9275 0.7442 0.6533 0.8273 0.7411
1.3248 107.0 43335 1.9164 0.7420 0.6533 0.8296 0.7389
1.2936 108.0 43740 1.8988 0.7485 0.6554 0.8257 0.7461
1.2709 109.0 44145 1.8812 0.7488 0.6614 0.8268 0.7472
1.2543 110.0 44550 1.8682 0.7504 0.6617 0.8276 0.7499
1.2273 111.0 44955 1.8524 0.7532 0.6612 0.8276 0.7524
1.2041 112.0 45360 1.8357 0.7535 0.6641 0.8307 0.7512
1.1821 113.0 45765 1.8244 0.7563 0.6678 0.8299 0.7546
1.1605 114.0 46170 1.8137 0.7550 0.6658 0.8269 0.7533
1.138 115.0 46575 1.7954 0.7581 0.6714 0.8317 0.7563
1.1139 116.0 46980 1.7860 0.7590 0.6711 0.8292 0.7586
1.095 117.0 47385 1.7778 0.7609 0.6716 0.8295 0.7593
1.0755 118.0 47790 1.7636 0.7615 0.6726 0.8320 0.7597
1.0568 119.0 48195 1.7544 0.7615 0.6715 0.8304 0.7594
1.0381 120.0 48600 1.7403 0.7624 0.6730 0.8293 0.7606
1.0219 121.0 49005 1.7280 0.7621 0.6725 0.8293 0.7615
1.0027 122.0 49410 1.7173 0.7637 0.6751 0.8309 0.7619
0.982 123.0 49815 1.7084 0.7631 0.6738 0.8286 0.7625
0.9653 124.0 50220 1.7028 0.7637 0.6766 0.8305 0.7643
0.9514 125.0 50625 1.6931 0.7637 0.6747 0.8267 0.7638
0.9339 126.0 51030 1.6833 0.7649 0.6759 0.8266 0.7652
0.9179 127.0 51435 1.6746 0.7655 0.6780 0.8293 0.7652
0.9026 128.0 51840 1.6650 0.7643 0.6767 0.8264 0.7650
0.8864 129.0 52245 1.6575 0.7652 0.6764 0.8286 0.7653
0.8695 130.0 52650 1.6465 0.7674 0.6781 0.8250 0.7671
0.8548 131.0 53055 1.6428 0.7674 0.6799 0.8277 0.7692
0.8407 132.0 53460 1.6348 0.7662 0.6791 0.8269 0.7681
0.8302 133.0 53865 1.6304 0.7665 0.6776 0.8260 0.7678
0.816 134.0 54270 1.6201 0.7696 0.6812 0.8271 0.7718
0.8042 135.0 54675 1.6137 0.7699 0.6841 0.8280 0.7725
0.7889 136.0 55080 1.6066 0.7714 0.6839 0.8275 0.7753
0.7737 137.0 55485 1.6014 0.7736 0.6907 0.8285 0.7768
0.7639 138.0 55890 1.5958 0.7708 0.6879 0.8285 0.7742
0.7531 139.0 56295 1.5896 0.7720 0.6875 0.8302 0.7751
0.7447 140.0 56700 1.5849 0.7761 0.6931 0.8301 0.7794
0.7321 141.0 57105 1.5799 0.7717 0.6870 0.8301 0.7755
0.7172 142.0 57510 1.5744 0.7717 0.6865 0.8288 0.7753
0.7069 143.0 57915 1.5718 0.7736 0.6914 0.8315 0.7776
0.7006 144.0 58320 1.5667 0.7754 0.6889 0.8274 0.7805
0.6878 145.0 58725 1.5621 0.7758 0.6909 0.8285 0.7797
0.6781 146.0 59130 1.5597 0.7770 0.6917 0.8304 0.7807
0.6685 147.0 59535 1.5567 0.7751 0.6914 0.8280 0.7807
0.659 148.0 59940 1.5520 0.7758 0.6908 0.8300 0.7809
0.6563 149.0 60345 1.5482 0.7767 0.6912 0.8304 0.7815
0.6433 150.0 60750 1.5440 0.7745 0.6894 0.8287 0.7803
0.6343 151.0 61155 1.5383 0.7764 0.6907 0.8307 0.7804
0.6251 152.0 61560 1.5365 0.7773 0.6920 0.8310 0.7817
0.6171 153.0 61965 1.5329 0.7776 0.6934 0.8313 0.7823
0.6112 154.0 62370 1.5278 0.7792 0.6929 0.8304 0.7832
0.6017 155.0 62775 1.5239 0.7795 0.6941 0.8308 0.7843
0.5956 156.0 63180 1.5230 0.7795 0.6955 0.8313 0.7852
0.5865 157.0 63585 1.5181 0.7795 0.6968 0.8318 0.7855
0.5826 158.0 63990 1.5172 0.7779 0.6951 0.8314 0.7840
0.5755 159.0 64395 1.5160 0.7779 0.6956 0.8315 0.7828
0.5672 160.0 64800 1.5119 0.7782 0.6927 0.8294 0.7827
0.5625 161.0 65205 1.5083 0.7798 0.6962 0.8314 0.7843
0.5587 162.0 65610 1.5078 0.7792 0.6954 0.8288 0.7860
0.5496 163.0 66015 1.5044 0.7792 0.6952 0.8315 0.7846
0.5456 164.0 66420 1.5009 0.7801 0.6965 0.8313 0.7853
0.5368 165.0 66825 1.5002 0.7801 0.6967 0.8316 0.7845
0.5374 166.0 67230 1.4984 0.7801 0.6965 0.8312 0.7848
0.5277 167.0 67635 1.4951 0.7822 0.7000 0.8342 0.7868
0.5273 168.0 68040 1.4969 0.7813 0.6992 0.8317 0.7860
0.5184 169.0 68445 1.4934 0.7813 0.6977 0.8319 0.7864
0.5143 170.0 68850 1.4894 0.7819 0.6988 0.8324 0.7876
0.5116 171.0 69255 1.4901 0.7829 0.7000 0.8346 0.7879
0.5059 172.0 69660 1.4893 0.7822 0.7003 0.8331 0.7880
0.5043 173.0 70065 1.4877 0.7816 0.6974 0.8312 0.7870
0.4958 174.0 70470 1.4858 0.7807 0.6968 0.8317 0.7865
0.4936 175.0 70875 1.4854 0.7826 0.6991 0.8324 0.7870
0.4923 176.0 71280 1.4841 0.7810 0.6977 0.8322 0.7862
0.4866 177.0 71685 1.4830 0.7807 0.7004 0.8339 0.7861
0.4851 178.0 72090 1.4826 0.7798 0.6974 0.8319 0.7854
0.4795 179.0 72495 1.4816 0.7816 0.6985 0.8317 0.7870
0.4772 180.0 72900 1.4789 0.7798 0.6985 0.8327 0.7855
0.4765 181.0 73305 1.4805 0.7810 0.6975 0.8320 0.7858
0.4734 182.0 73710 1.4777 0.7822 0.6997 0.8337 0.7867
0.4702 183.0 74115 1.4776 0.7822 0.6988 0.8316 0.7879
0.4717 184.0 74520 1.4765 0.7822 0.6986 0.8332 0.7875
0.4658 185.0 74925 1.4764 0.7816 0.6982 0.8329 0.7866
0.4642 186.0 75330 1.4758 0.7826 0.6992 0.8331 0.7877
0.462 187.0 75735 1.4746 0.7829 0.7001 0.8322 0.7888
0.4599 188.0 76140 1.4740 0.7822 0.6986 0.8308 0.7881
0.4586 189.0 76545 1.4744 0.7832 0.7002 0.8324 0.7887
0.4554 190.0 76950 1.4738 0.7826 0.7001 0.8328 0.7882
0.4539 191.0 77355 1.4731 0.7822 0.6988 0.8303 0.7881
0.4525 192.0 77760 1.4724 0.7829 0.7010 0.8328 0.7879
0.4483 193.0 78165 1.4722 0.7829 0.6989 0.8315 0.7888
0.4469 194.0 78570 1.4715 0.7829 0.6995 0.8313 0.7885
0.447 195.0 78975 1.4716 0.7835 0.6997 0.8322 0.7886
0.4484 196.0 79380 1.4712 0.7826 0.6994 0.8324 0.7878
0.4459 197.0 79785 1.4707 0.7832 0.7003 0.8329 0.7882
0.4455 198.0 80190 1.4710 0.7826 0.7001 0.8333 0.7878
0.4457 199.0 80595 1.4709 0.7829 0.7000 0.8326 0.7882
0.4422 200.0 81000 1.4708 0.7826 0.6998 0.8330 0.7875
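
The accuracy, F1, precision, and recall columns above are typical Trainer compute_metrics outputs; a hedged sketch of how they could be computed is shown below. The averaging mode is an assumption, since the card does not state whether macro or weighted averaging was used.

```python
# Sketch of a compute_metrics function producing the metric columns above.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # "macro" averaging is an assumption; the original setting is not documented.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```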

Framework versions

  • Transformers 4.51.3
  • PyTorch 2.6.0+cu124
  • Datasets 3.5.0
  • Tokenizers 0.21.1