resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.5_a0.9

This model is a fine-tuned version of microsoft/resnet-50; the dataset field of the card is not set, although the model name suggests the Tobacco3482 document-image dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the metrics):

  • Loss: 0.8831
  • Accuracy: 0.695
  • Brier Loss: 0.4126
  • NLL: 2.4628
  • F1 Micro: 0.695
  • F1 Macro: 0.6387
  • ECE: 0.2426
  • AURC: 0.1068
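
The card does not include usage instructions. The snippet below is only a minimal inference sketch, assuming the checkpoint loads through the standard Transformers image-classification auto classes and that the Hub repository id is bdpc/ followed by the model name; the image path is a placeholder.

```python
# Minimal inference sketch (assumptions: the checkpoint exposes an
# image-classification head loadable via the auto classes, and the Hub
# repository id is bdpc/<model name>; the image path is a placeholder).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "bdpc/resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.5_a0.9"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("document_page.png").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = int(logits.argmax(-1))
print(model.config.id2label[predicted_id])
```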

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure
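
The model name indicates knowledge distillation with a combined cross-entropy + KD objective (CEKD), temperature 1.5, and alpha 0.9, but the card does not document the loss itself. The sketch below shows one common formulation under that assumption; the exact weighting and reduction used for this checkpoint may differ.

```python
# Illustrative CE + KD objective (assumption inferred from the model name:
# temperature T = 1.5, alpha = 0.9). Not necessarily the exact loss used
# to train this checkpoint.
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, temperature=1.5, alpha=0.9):
    # Hard-label cross-entropy term.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label KL term between temperature-scaled distributions,
    # scaled by T^2 as in Hinton et al. (2015).
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Convex combination weighted by alpha.
    return alpha * kd + (1.0 - alpha) * ce
```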

Training hyperparameters

The following hyperparameters were used during training (an illustrative TrainingArguments sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
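
For reference, these settings map roughly onto the following TrainingArguments; this is a sketch, not the published training script, and Adam with the listed betas and epsilon is simply the Trainer default.

```python
# Sketch of TrainingArguments mirroring the hyperparameters listed above
# (Transformers 4.33.x). The actual training script is not published.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.5_a0.9",
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # the results table logs validation metrics once per epoch
)
```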

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL    | F1 Micro | F1 Macro | ECE    | AURC   |
|---------------|-------|------|-----------------|----------|------------|--------|----------|----------|--------|--------|
| No log        | 1.0   | 13   | 2.1233          | 0.16     | 0.8967     | 8.5697 | 0.16     | 0.1066   | 0.2078 | 0.8316 |
| No log        | 2.0   | 26   | 2.1188          | 0.14     | 0.8961     | 8.2960 | 0.14     | 0.0886   | 0.1947 | 0.8419 |
| No log        | 3.0   | 39   | 2.0764          | 0.195    | 0.8873     | 6.4713 | 0.195    | 0.1159   | 0.2335 | 0.7665 |
| No log        | 4.0   | 52   | 2.0365          | 0.21     | 0.8787     | 5.7752 | 0.2100   | 0.0930   | 0.2376 | 0.7548 |
| No log        | 5.0   | 65   | 1.9888          | 0.2      | 0.8682     | 5.8737 | 0.2000   | 0.0775   | 0.2417 | 0.7314 |
| No log        | 6.0   | 78   | 1.8998          | 0.215    | 0.8465     | 5.8553 | 0.2150   | 0.0970   | 0.2586 | 0.7063 |
| No log        | 7.0   | 91   | 1.8351          | 0.33     | 0.8289     | 5.7781 | 0.33     | 0.1904   | 0.3089 | 0.6103 |
| No log        | 8.0   | 104  | 1.7342          | 0.4      | 0.7968     | 5.5366 | 0.4000   | 0.2476   | 0.3457 | 0.4276 |
| No log        | 9.0   | 117  | 1.6787          | 0.36     | 0.7757     | 5.7414 | 0.36     | 0.2148   | 0.3062 | 0.4324 |
| No log        | 10.0  | 130  | 1.6942          | 0.4      | 0.7870     | 5.2615 | 0.4000   | 0.2831   | 0.3168 | 0.5227 |
| No log        | 11.0  | 143  | 1.5992          | 0.4      | 0.7489     | 4.7833 | 0.4000   | 0.2649   | 0.3053 | 0.4679 |
| No log        | 12.0  | 156  | 1.6071          | 0.425    | 0.7532     | 4.2803 | 0.425    | 0.2906   | 0.3196 | 0.4646 |
| No log        | 13.0  | 169  | 1.4727          | 0.48     | 0.6925     | 4.1911 | 0.48     | 0.3239   | 0.2957 | 0.3081 |
| No log        | 14.0  | 182  | 1.4275          | 0.515    | 0.6705     | 3.7980 | 0.515    | 0.3569   | 0.3211 | 0.2626 |
| No log        | 15.0  | 195  | 1.3282          | 0.56     | 0.6200     | 3.6359 | 0.56     | 0.4163   | 0.2990 | 0.2213 |
| No log        | 16.0  | 208  | 1.3280          | 0.565    | 0.6263     | 3.4960 | 0.565    | 0.4177   | 0.3217 | 0.2346 |
| No log        | 17.0  | 221  | 1.3220          | 0.595    | 0.6196     | 3.2202 | 0.595    | 0.4639   | 0.3322 | 0.1992 |
| No log        | 18.0  | 234  | 1.2359          | 0.595    | 0.5840     | 3.3332 | 0.595    | 0.4780   | 0.3042 | 0.2011 |
| No log        | 19.0  | 247  | 1.1690          | 0.625    | 0.5531     | 3.2423 | 0.625    | 0.5233   | 0.2940 | 0.1807 |
| No log        | 20.0  | 260  | 1.1644          | 0.64     | 0.5532     | 3.0542 | 0.64     | 0.5429   | 0.3019 | 0.1821 |
| No log        | 21.0  | 273  | 1.1611          | 0.62     | 0.5516     | 2.9412 | 0.62     | 0.5193   | 0.2865 | 0.2160 |
| No log        | 22.0  | 286  | 1.3427          | 0.585    | 0.6361     | 3.0936 | 0.585    | 0.5089   | 0.3442 | 0.2922 |
| No log        | 23.0  | 299  | 1.1238          | 0.62     | 0.5440     | 2.7924 | 0.62     | 0.5458   | 0.2654 | 0.2088 |
| No log        | 24.0  | 312  | 1.2008          | 0.685    | 0.5615     | 2.5918 | 0.685    | 0.5890   | 0.3907 | 0.1516 |
| No log        | 25.0  | 325  | 1.0764          | 0.695    | 0.5000     | 2.6354 | 0.695    | 0.6107   | 0.3126 | 0.1397 |
| No log        | 26.0  | 338  | 1.0268          | 0.675    | 0.4822     | 2.4798 | 0.675    | 0.5992   | 0.2775 | 0.1229 |
| No log        | 27.0  | 351  | 1.0340          | 0.67     | 0.4893     | 2.4316 | 0.67     | 0.5997   | 0.2763 | 0.1638 |
| No log        | 28.0  | 364  | 1.0154          | 0.665    | 0.4769     | 2.6487 | 0.665    | 0.6034   | 0.2590 | 0.1487 |
| No log        | 29.0  | 377  | 1.0013          | 0.64     | 0.4814     | 2.5899 | 0.64     | 0.5771   | 0.2429 | 0.1593 |
| No log        | 30.0  | 390  | 1.0173          | 0.685    | 0.4714     | 2.6922 | 0.685    | 0.6178   | 0.2898 | 0.1423 |
| No log        | 31.0  | 403  | 0.9630          | 0.695    | 0.4509     | 2.6349 | 0.695    | 0.6206   | 0.2746 | 0.1248 |
| No log        | 32.0  | 416  | 0.9950          | 0.68     | 0.4648     | 2.4144 | 0.68     | 0.6362   | 0.3020 | 0.1725 |
| No log        | 33.0  | 429  | 0.9711          | 0.72     | 0.4502     | 2.6651 | 0.72     | 0.6571   | 0.2892 | 0.1268 |
| No log        | 34.0  | 442  | 0.9491          | 0.705    | 0.4425     | 2.7169 | 0.705    | 0.6425   | 0.2541 | 0.1145 |
| No log        | 35.0  | 455  | 0.9213          | 0.685    | 0.4309     | 2.5736 | 0.685    | 0.6174   | 0.2380 | 0.1161 |
| No log        | 36.0  | 468  | 0.9144          | 0.695    | 0.4297     | 2.4141 | 0.695    | 0.6308   | 0.2502 | 0.1154 |
| No log        | 37.0  | 481  | 0.9242          | 0.715    | 0.4264     | 2.7191 | 0.715    | 0.6429   | 0.2386 | 0.1030 |
| No log        | 38.0  | 494  | 0.9290          | 0.695    | 0.4346     | 2.6515 | 0.695    | 0.6367   | 0.2432 | 0.1189 |
| 1.0953        | 39.0  | 507  | 0.9110          | 0.69     | 0.4262     | 2.6615 | 0.69     | 0.6328   | 0.2368 | 0.1112 |
| 1.0953        | 40.0  | 520  | 0.9000          | 0.695    | 0.4186     | 2.4590 | 0.695    | 0.6417   | 0.2453 | 0.1070 |
| 1.0953        | 41.0  | 533  | 0.8961          | 0.69     | 0.4189     | 2.4170 | 0.69     | 0.6368   | 0.2349 | 0.1090 |
| 1.0953        | 42.0  | 546  | 0.9103          | 0.675    | 0.4286     | 2.6129 | 0.675    | 0.6193   | 0.2318 | 0.1190 |
| 1.0953        | 43.0  | 559  | 0.8858          | 0.715    | 0.4131     | 2.5243 | 0.715    | 0.6517   | 0.2462 | 0.1053 |
| 1.0953        | 44.0  | 572  | 0.8872          | 0.705    | 0.4135     | 2.3272 | 0.705    | 0.6542   | 0.2596 | 0.1051 |
| 1.0953        | 45.0  | 585  | 0.8897          | 0.715    | 0.4136     | 2.3788 | 0.715    | 0.6532   | 0.2560 | 0.1035 |
| 1.0953        | 46.0  | 598  | 0.8842          | 0.7      | 0.4117     | 2.5325 | 0.7      | 0.6446   | 0.2327 | 0.1075 |
| 1.0953        | 47.0  | 611  | 0.8857          | 0.675    | 0.4141     | 2.5451 | 0.675    | 0.6203   | 0.2473 | 0.1125 |
| 1.0953        | 48.0  | 624  | 0.8875          | 0.69     | 0.4164     | 2.4696 | 0.69     | 0.6352   | 0.2542 | 0.1109 |
| 1.0953        | 49.0  | 637  | 0.8842          | 0.69     | 0.4153     | 2.5338 | 0.69     | 0.6358   | 0.2302 | 0.1112 |
| 1.0953        | 50.0  | 650  | 0.8831          | 0.695    | 0.4126     | 2.4628 | 0.695    | 0.6387   | 0.2426 | 0.1068 |
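
The calibration-related columns (Brier Loss, ECE) are not defined in the card. The sketch below gives standard definitions for the multi-class Brier score and expected calibration error; details such as the binning scheme are assumptions, since the evaluation code behind this table is not published.

```python
# Standard (assumed) definitions for two of the reported metrics. Details
# such as the number of ECE bins (15 here) are assumptions; the evaluation
# code used for this card is not published.
import numpy as np

def brier_loss(probs, labels):
    """Mean over examples of the squared error between the predicted
    probability vector and the one-hot label vector (multi-class Brier)."""
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs, labels, n_bins=15):
    """Equal-width-bin ECE over the maximum predicted probability."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            ece += mask.mean() * abs(accuracies[mask].mean() - confidences[mask].mean())
    return float(ece)
```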

Framework versions

  • Transformers 4.33.3
  • Pytorch 2.2.0.dev20231002
  • Datasets 2.7.1
  • Tokenizers 0.13.3