resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t2.5_a0.5

This model is a fine-tuned version of microsoft/resnet-50 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how the calibration and ranking metrics can be computed follows the list):

  • Loss: 0.6481
  • Accuracy: 0.69
  • Brier Loss: 0.4919
  • NLL: 2.4969
  • F1 Micro: 0.69
  • F1 Macro: 0.6317
  • ECE: 0.3029
  • AURC: 0.1260

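Brier loss, NLL, ECE, and AURC are probability-quality metrics computed from the softmax outputs. The exact evaluation code is not included in this card; the sketch below shows one common way to compute them (the 15-bin ECE and the risk-coverage formulation of AURC are assumptions, not necessarily what was used here):

```python
import numpy as np

def brier_loss(probs, labels):
    """Mean squared error between predicted probabilities and one-hot labels."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def nll(probs, labels, eps=1e-12):
    """Negative log-likelihood of the true class."""
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + eps))

def ece(probs, labels, n_bins=15):
    """Expected calibration error over equal-width confidence bins (bin count assumed)."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            total += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return total

def aurc(probs, labels):
    """Area under the risk-coverage curve: add samples in decreasing-confidence order."""
    conf = probs.max(axis=1)
    errors = (probs.argmax(axis=1) != labels).astype(float)
    order = np.argsort(-conf)
    cum_risk = np.cumsum(errors[order]) / np.arange(1, len(labels) + 1)
    return cum_risk.mean()
```
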
Model description

More information needed

Intended uses & limitations

More information needed
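
As a minimal loading sketch, assuming the checkpoint is hosted at bdpc/resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t2.5_a0.5 and exposes the standard transformers image-classification interface with a bundled image processor config:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "bdpc/resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t2.5_a0.5"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

# "document.png" is a placeholder for a document image to classify
image = Image.open("document.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```

If no preprocessor config is bundled with the checkpoint, the microsoft/resnet-50 image processor is a reasonable fallback, since that is the stated base model.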

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
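
The suffix "kd_CEKD_t2.5_a0.5" in the model name suggests the fine-tuning combined cross-entropy with a knowledge-distillation term at temperature 2.5 and mixing weight alpha = 0.5. The training script is not published, so the following is only a sketch of what such a loss typically looks like; the function name and the exact weighting scheme are assumptions:

```python
import torch
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, temperature=2.5, alpha=0.5):
    """Hypothetical CE + KD objective implied by 'CEKD_t2.5_a0.5' in the model name."""
    # hard-label cross-entropy on the student predictions
    ce = F.cross_entropy(student_logits, labels)
    # soft-target KL divergence against the teacher, scaled by T^2 as is standard
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # alpha balances the hard-label term against the distillation term
    return alpha * ce + (1.0 - alpha) * kd
```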

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 13 | 1.4796 | 0.165 | 0.8965 | 8.4885 | 0.165 | 0.1123 | 0.2151 | 0.8341 |
| No log | 2.0 | 26 | 1.4679 | 0.165 | 0.8954 | 8.3391 | 0.165 | 0.1066 | 0.2136 | 0.8332 |
| No log | 3.0 | 39 | 1.4170 | 0.21 | 0.8858 | 6.1941 | 0.2100 | 0.0969 | 0.2433 | 0.7991 |
| No log | 4.0 | 52 | 1.3472 | 0.21 | 0.8711 | 6.0602 | 0.2100 | 0.0728 | 0.2320 | 0.7271 |
| No log | 5.0 | 65 | 1.2776 | 0.19 | 0.8572 | 6.1293 | 0.19 | 0.0537 | 0.2422 | 0.7473 |
| No log | 6.0 | 78 | 1.1840 | 0.245 | 0.8353 | 6.2405 | 0.245 | 0.1060 | 0.2810 | 0.6690 |
| No log | 7.0 | 91 | 1.0740 | 0.365 | 0.7936 | 6.3617 | 0.3650 | 0.1739 | 0.3136 | 0.3646 |
| No log | 8.0 | 104 | 1.1102 | 0.345 | 0.8081 | 5.8896 | 0.345 | 0.1812 | 0.3046 | 0.4292 |
| No log | 9.0 | 117 | 1.0735 | 0.34 | 0.7963 | 5.9970 | 0.34 | 0.1842 | 0.3028 | 0.4286 |
| No log | 10.0 | 130 | 1.1145 | 0.265 | 0.8110 | 5.9054 | 0.265 | 0.1300 | 0.2511 | 0.6350 |
| No log | 11.0 | 143 | 0.9981 | 0.325 | 0.7659 | 5.3834 | 0.325 | 0.1655 | 0.2790 | 0.4860 |
| No log | 12.0 | 156 | 1.0500 | 0.285 | 0.7898 | 4.9696 | 0.285 | 0.1594 | 0.2604 | 0.6636 |
| No log | 13.0 | 169 | 0.8764 | 0.445 | 0.6976 | 4.6456 | 0.445 | 0.2647 | 0.2779 | 0.3020 |
| No log | 14.0 | 182 | 0.9147 | 0.48 | 0.7108 | 4.4793 | 0.48 | 0.2942 | 0.3262 | 0.2862 |
| No log | 15.0 | 195 | 0.9776 | 0.38 | 0.7434 | 4.4065 | 0.38 | 0.2269 | 0.2938 | 0.5297 |
| No log | 16.0 | 208 | 0.8066 | 0.47 | 0.6494 | 3.9671 | 0.47 | 0.2966 | 0.2791 | 0.2907 |
| No log | 17.0 | 221 | 0.7766 | 0.535 | 0.6305 | 3.5250 | 0.535 | 0.3866 | 0.3003 | 0.2424 |
| No log | 18.0 | 234 | 0.8186 | 0.535 | 0.6458 | 3.3670 | 0.535 | 0.3792 | 0.3005 | 0.2311 |
| No log | 19.0 | 247 | 0.8156 | 0.52 | 0.6430 | 3.1633 | 0.52 | 0.3675 | 0.3072 | 0.2667 |
| No log | 20.0 | 260 | 0.8386 | 0.55 | 0.6462 | 3.2549 | 0.55 | 0.4251 | 0.3103 | 0.2703 |
| No log | 21.0 | 273 | 0.7996 | 0.515 | 0.6342 | 3.1396 | 0.515 | 0.3969 | 0.3177 | 0.2867 |
| No log | 22.0 | 286 | 0.8605 | 0.6 | 0.6472 | 3.2563 | 0.6 | 0.4717 | 0.3810 | 0.2113 |
| No log | 23.0 | 299 | 0.7138 | 0.595 | 0.5713 | 3.1171 | 0.595 | 0.4657 | 0.2773 | 0.2034 |
| No log | 24.0 | 312 | 0.7212 | 0.665 | 0.5740 | 2.9688 | 0.665 | 0.5474 | 0.3366 | 0.1754 |
| No log | 25.0 | 325 | 0.7463 | 0.63 | 0.5843 | 2.8998 | 0.63 | 0.5502 | 0.3432 | 0.2072 |
| No log | 26.0 | 338 | 0.7231 | 0.67 | 0.5626 | 3.1334 | 0.67 | 0.5564 | 0.3160 | 0.1521 |
| No log | 27.0 | 351 | 0.6913 | 0.68 | 0.5427 | 2.8906 | 0.68 | 0.5702 | 0.3354 | 0.1406 |
| No log | 28.0 | 364 | 0.6825 | 0.66 | 0.5342 | 2.8619 | 0.66 | 0.5615 | 0.2902 | 0.1625 |
| No log | 29.0 | 377 | 0.7015 | 0.665 | 0.5549 | 2.7315 | 0.665 | 0.5741 | 0.3305 | 0.1769 |
| No log | 30.0 | 390 | 0.6939 | 0.67 | 0.5406 | 2.7114 | 0.67 | 0.5720 | 0.3353 | 0.1420 |
| No log | 31.0 | 403 | 0.6836 | 0.69 | 0.5265 | 2.7567 | 0.69 | 0.5982 | 0.3216 | 0.1455 |
| No log | 32.0 | 416 | 0.6728 | 0.69 | 0.5211 | 2.6858 | 0.69 | 0.6056 | 0.3124 | 0.1453 |
| No log | 33.0 | 429 | 0.6926 | 0.675 | 0.5403 | 2.5815 | 0.675 | 0.6095 | 0.3258 | 0.1683 |
| No log | 34.0 | 442 | 0.6673 | 0.66 | 0.5090 | 2.5591 | 0.66 | 0.5722 | 0.2950 | 0.1385 |
| No log | 35.0 | 455 | 0.6811 | 0.675 | 0.5207 | 2.5813 | 0.675 | 0.5841 | 0.3324 | 0.1273 |
| No log | 36.0 | 468 | 0.6648 | 0.69 | 0.5119 | 2.5745 | 0.69 | 0.6225 | 0.3433 | 0.1320 |
| No log | 37.0 | 481 | 0.6623 | 0.67 | 0.5092 | 2.6134 | 0.67 | 0.6129 | 0.3204 | 0.1471 |
| No log | 38.0 | 494 | 0.6635 | 0.69 | 0.5088 | 2.3862 | 0.69 | 0.6192 | 0.3201 | 0.1311 |
| 0.7628 | 39.0 | 507 | 0.6554 | 0.685 | 0.5008 | 2.5849 | 0.685 | 0.6210 | 0.3179 | 0.1377 |
| 0.7628 | 40.0 | 520 | 0.6567 | 0.685 | 0.5022 | 2.6498 | 0.685 | 0.6310 | 0.3127 | 0.1414 |
| 0.7628 | 41.0 | 533 | 0.6558 | 0.695 | 0.4996 | 2.5917 | 0.695 | 0.6347 | 0.3115 | 0.1321 |
| 0.7628 | 42.0 | 546 | 0.6578 | 0.695 | 0.5021 | 2.4864 | 0.695 | 0.6259 | 0.3098 | 0.1306 |
| 0.7628 | 43.0 | 559 | 0.6544 | 0.685 | 0.4969 | 2.5757 | 0.685 | 0.6175 | 0.2955 | 0.1342 |
| 0.7628 | 44.0 | 572 | 0.6507 | 0.685 | 0.4944 | 2.5057 | 0.685 | 0.6257 | 0.3144 | 0.1304 |
| 0.7628 | 45.0 | 585 | 0.6501 | 0.675 | 0.4937 | 2.4903 | 0.675 | 0.6208 | 0.3091 | 0.1301 |
| 0.7628 | 46.0 | 598 | 0.6518 | 0.685 | 0.4949 | 2.4732 | 0.685 | 0.6254 | 0.3164 | 0.1235 |
| 0.7628 | 47.0 | 611 | 0.6499 | 0.685 | 0.4936 | 2.4924 | 0.685 | 0.6273 | 0.3124 | 0.1323 |
| 0.7628 | 48.0 | 624 | 0.6490 | 0.7 | 0.4925 | 2.4999 | 0.7 | 0.6353 | 0.3147 | 0.1243 |
| 0.7628 | 49.0 | 637 | 0.6510 | 0.685 | 0.4933 | 2.5758 | 0.685 | 0.6242 | 0.3206 | 0.1281 |
| 0.7628 | 50.0 | 650 | 0.6481 | 0.69 | 0.4919 | 2.4969 | 0.69 | 0.6317 | 0.3029 | 0.1260 |

Framework versions

  • Transformers 4.33.3
  • Pytorch 2.2.0.dev20231002
  • Datasets 2.7.1
  • Tokenizers 0.13.3