
# swin-tiny-patch4-window7-224-finetuned-st-ucimhar-stacked-tiny

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 1.2422
- Accuracy: 0.7833
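
For reference, a minimal inference sketch (not part of the original card) is shown below. It assumes the checkpoint is published on the Hub under this card's repo id, and `example.png` is a placeholder for one of the stacked sensor-signal images implied by the dataset name.

```python
# Minimal sketch: load the fine-tuned checkpoint with the image-classification
# pipeline. "example.png" is a hypothetical input path, not from the card.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="ayubkfupm/swin-tiny-patch4-window7-224-finetuned-st-ucimhar-stacked-tiny",
)

predictions = classifier("example.png")  # path or URL to an input image
for p in predictions:
    print(f"{p['label']}: {p['score']:.4f}")
```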

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 12
- eval_batch_size: 12
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 48
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 200
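
As a rough sketch (not from the original card), the configuration above maps onto `transformers` `TrainingArguments` as follows. `output_dir` is a placeholder, and the Adam settings listed (betas=(0.9,0.999), epsilon=1e-08) match the `Trainer` defaults, so they need no explicit argument.

```python
# Sketch of the hyperparameters above as TrainingArguments.
# The effective batch size of 48 comes from 12 (per-device) x 4
# (gradient accumulation steps) on a single device.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-tiny-finetuned",  # placeholder, not from the card
    learning_rate=5e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=12,
    gradient_accumulation_steps=4,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=200,
)
```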

### Training results

| Training Loss | Epoch    | Step  | Validation Loss | Accuracy |
|:-------------:|:--------:|:-----:|:---------------:|:--------:|
| 1.7491        | 0.9977   | 107   | 1.6778          | 0.3781   |
| 1.3073        | 1.9953   | 214   | 1.1312          | 0.5299   |
| 1.011         | 2.9930   | 321   | 0.9034          | 0.5961   |
| 0.922         | 4.0      | 429   | 0.7951          | 0.6179   |
| 0.8234        | 4.9977   | 536   | 0.6725          | 0.6709   |
| 0.7737        | 5.9953   | 643   | 0.7821          | 0.6319   |
| 0.7946        | 6.9930   | 750   | 0.6644          | 0.6854   |
| 0.8128        | 8.0      | 858   | 0.7123          | 0.6582   |
| 0.715         | 8.9977   | 965   | 0.6771          | 0.6759   |
| 0.7034        | 9.9953   | 1072  | 0.6267          | 0.7090   |
| 0.6079        | 10.9930  | 1179  | 0.6672          | 0.6895   |
| 0.6178        | 12.0     | 1287  | 0.6762          | 0.6750   |
| 0.6561        | 12.9977  | 1394  | 0.5786          | 0.7085   |
| 0.6171        | 13.9953  | 1501  | 0.5703          | 0.7244   |
| 0.6261        | 14.9930  | 1608  | 0.6156          | 0.7058   |
| 0.646         | 16.0     | 1716  | 0.6111          | 0.7203   |
| 0.5834        | 16.9977  | 1823  | 0.5608          | 0.7407   |
| 0.5469        | 17.9953  | 1930  | 0.5682          | 0.7425   |
| 0.5964        | 18.9930  | 2037  | 0.5712          | 0.7452   |
| 0.6095        | 20.0     | 2145  | 0.5856          | 0.7226   |
| 0.5992        | 20.9977  | 2252  | 0.6288          | 0.7085   |
| 0.5572        | 21.9953  | 2359  | 0.6966          | 0.7049   |
| 0.4914        | 22.9930  | 2466  | 0.6251          | 0.7131   |
| 0.5694        | 24.0     | 2574  | 0.5781          | 0.7407   |
| 0.5108        | 24.9977  | 2681  | 0.5544          | 0.7493   |
| 0.4507        | 25.9953  | 2788  | 0.5737          | 0.7421   |
| 0.5308        | 26.9930  | 2895  | 0.5404          | 0.7529   |
| 0.4754        | 28.0     | 3003  | 0.5940          | 0.7439   |
| 0.4562        | 28.9977  | 3110  | 0.5465          | 0.7507   |
| 0.4617        | 29.9953  | 3217  | 0.5298          | 0.7665   |
| 0.4347        | 30.9930  | 3324  | 0.5796          | 0.7566   |
| 0.466         | 32.0     | 3432  | 0.5378          | 0.7602   |
| 0.4335        | 32.9977  | 3539  | 0.5227          | 0.7724   |
| 0.4335        | 33.9953  | 3646  | 0.6071          | 0.7471   |
| 0.492         | 34.9930  | 3753  | 0.5336          | 0.7715   |
| 0.3798        | 36.0     | 3861  | 0.5833          | 0.7679   |
| 0.4314        | 36.9977  | 3968  | 0.5538          | 0.7747   |
| 0.4269        | 37.9953  | 4075  | 0.5880          | 0.7616   |
| 0.3804        | 38.9930  | 4182  | 0.6006          | 0.7665   |
| 0.4089        | 40.0     | 4290  | 0.5728          | 0.7747   |
| 0.3446        | 40.9977  | 4397  | 0.5992          | 0.7747   |
| 0.3786        | 41.9953  | 4504  | 0.5686          | 0.7706   |
| 0.3944        | 42.9930  | 4611  | 0.6180          | 0.7634   |
| 0.3295        | 44.0     | 4719  | 0.5682          | 0.7743   |
| 0.3104        | 44.9977  | 4826  | 0.5924          | 0.7702   |
| 0.3465        | 45.9953  | 4933  | 0.6169          | 0.7765   |
| 0.3611        | 46.9930  | 5040  | 0.5923          | 0.7829   |
| 0.2893        | 48.0     | 5148  | 0.6474          | 0.7697   |
| 0.3345        | 48.9977  | 5255  | 0.7161          | 0.7575   |
| 0.2894        | 49.9953  | 5362  | 0.6239          | 0.7661   |
| 0.2979        | 50.9930  | 5469  | 0.6255          | 0.7765   |
| 0.2933        | 52.0     | 5577  | 0.6235          | 0.7729   |
| 0.3072        | 52.9977  | 5684  | 0.6996          | 0.7638   |
| 0.3179        | 53.9953  | 5791  | 0.6933          | 0.7792   |
| 0.2318        | 54.9930  | 5898  | 0.6738          | 0.7811   |
| 0.2083        | 56.0     | 6006  | 0.6706          | 0.7792   |
| 0.1927        | 56.9977  | 6113  | 0.7160          | 0.7806   |
| 0.2316        | 57.9953  | 6220  | 0.7098          | 0.7706   |
| 0.2582        | 58.9930  | 6327  | 0.7736          | 0.7656   |
| 0.257         | 60.0     | 6435  | 0.6893          | 0.7879   |
| 0.2449        | 60.9977  | 6542  | 0.7491          | 0.7820   |
| 0.2335        | 61.9953  | 6649  | 0.7232          | 0.7770   |
| 0.2251        | 62.9930  | 6756  | 0.7697          | 0.7733   |
| 0.2055        | 64.0     | 6864  | 0.7920          | 0.7679   |
| 0.1984        | 64.9977  | 6971  | 0.7432          | 0.7788   |
| 0.2104        | 65.9953  | 7078  | 0.7694          | 0.7652   |
| 0.2279        | 66.9930  | 7185  | 0.7625          | 0.7788   |
| 0.2356        | 68.0     | 7293  | 0.7823          | 0.7747   |
| 0.2097        | 68.9977  | 7400  | 0.8695          | 0.7711   |
| 0.2393        | 69.9953  | 7507  | 0.7937          | 0.7770   |
| 0.1599        | 70.9930  | 7614  | 0.8057          | 0.7792   |
| 0.1836        | 72.0     | 7722  | 0.7616          | 0.7883   |
| 0.1823        | 72.9977  | 7829  | 0.8693          | 0.7688   |
| 0.1938        | 73.9953  | 7936  | 0.8137          | 0.7647   |
| 0.1821        | 74.9930  | 8043  | 0.8643          | 0.7688   |
| 0.2125        | 76.0     | 8151  | 0.8180          | 0.7824   |
| 0.1872        | 76.9977  | 8258  | 0.9031          | 0.7684   |
| 0.219         | 77.9953  | 8365  | 0.8406          | 0.7738   |
| 0.1639        | 78.9930  | 8472  | 0.8814          | 0.7720   |
| 0.1532        | 80.0     | 8580  | 0.8852          | 0.7792   |
| 0.2049        | 80.9977  | 8687  | 0.8322          | 0.7788   |
| 0.1379        | 81.9953  | 8794  | 0.9790          | 0.7711   |
| 0.145         | 82.9930  | 8901  | 0.9681          | 0.7620   |
| 0.201         | 84.0     | 9009  | 0.9446          | 0.7801   |
| 0.1433        | 84.9977  | 9116  | 0.8740          | 0.7711   |
| 0.127         | 85.9953  | 9223  | 0.8781          | 0.7833   |
| 0.1646        | 86.9930  | 9330  | 0.8880          | 0.7815   |
| 0.1671        | 88.0     | 9438  | 0.9304          | 0.7733   |
| 0.123         | 88.9977  | 9545  | 0.9443          | 0.7738   |
| 0.1261        | 89.9953  | 9652  | 0.9818          | 0.7638   |
| 0.1143        | 90.9930  | 9759  | 0.9140          | 0.7738   |
| 0.1621        | 92.0     | 9867  | 0.8911          | 0.7733   |
| 0.1607        | 92.9977  | 9974  | 0.8875          | 0.7801   |
| 0.1501        | 93.9953  | 10081 | 0.9756          | 0.7652   |
| 0.1313        | 94.9930  | 10188 | 0.9266          | 0.7724   |
| 0.1304        | 96.0     | 10296 | 0.9165          | 0.7811   |
| 0.1178        | 96.9977  | 10403 | 0.8847          | 0.7806   |
| 0.1022        | 97.9953  | 10510 | 0.9896          | 0.7747   |
| 0.1535        | 98.9930  | 10617 | 0.9597          | 0.7634   |
| 0.1235        | 100.0    | 10725 | 1.0785          | 0.7811   |
| 0.1537        | 100.9977 | 10832 | 0.9804          | 0.7665   |
| 0.1034        | 101.9953 | 10939 | 1.0229          | 0.7747   |
| 0.1151        | 102.9930 | 11046 | 0.9565          | 0.7792   |
| 0.1207        | 104.0    | 11154 | 0.9972          | 0.7752   |
| 0.1004        | 104.9977 | 11261 | 1.0217          | 0.7693   |
| 0.1129        | 105.9953 | 11368 | 1.0180          | 0.7743   |
| 0.1461        | 106.9930 | 11475 | 1.0894          | 0.7634   |
| 0.1058        | 108.0    | 11583 | 1.0378          | 0.7801   |
| 0.1182        | 108.9977 | 11690 | 0.9832          | 0.7756   |
| 0.1234        | 109.9953 | 11797 | 0.9694          | 0.7915   |
| 0.0876        | 110.9930 | 11904 | 1.0163          | 0.7783   |
| 0.1114        | 112.0    | 12012 | 1.0190          | 0.7720   |
| 0.1102        | 112.9977 | 12119 | 1.0097          | 0.7788   |
| 0.1341        | 113.9953 | 12226 | 1.0256          | 0.7797   |
| 0.0925        | 114.9930 | 12333 | 1.0942          | 0.7797   |
| 0.099         | 116.0    | 12441 | 1.1353          | 0.7715   |
| 0.0949        | 116.9977 | 12548 | 1.0752          | 0.7765   |
| 0.0999        | 117.9953 | 12655 | 1.0974          | 0.7765   |
| 0.0942        | 118.9930 | 12762 | 1.0812          | 0.7761   |
| 0.1149        | 120.0    | 12870 | 1.0109          | 0.7824   |
| 0.1101        | 120.9977 | 12977 | 1.0238          | 0.7879   |
| 0.0971        | 121.9953 | 13084 | 1.0740          | 0.7724   |
| 0.1146        | 122.9930 | 13191 | 1.0517          | 0.7788   |
| 0.1378        | 124.0    | 13299 | 0.9993          | 0.7765   |
| 0.1491        | 124.9977 | 13406 | 1.0226          | 0.7797   |
| 0.0993        | 125.9953 | 13513 | 1.0196          | 0.7806   |
| 0.1103        | 126.9930 | 13620 | 1.0618          | 0.7829   |
| 0.0628        | 128.0    | 13728 | 1.1314          | 0.7752   |
| 0.125         | 128.9977 | 13835 | 1.0911          | 0.7806   |
| 0.1051        | 129.9953 | 13942 | 1.1129          | 0.7729   |
| 0.07          | 130.9930 | 14049 | 1.1152          | 0.7774   |
| 0.1128        | 132.0    | 14157 | 1.1385          | 0.7815   |
| 0.1186        | 132.9977 | 14264 | 1.0660          | 0.7915   |
| 0.0828        | 133.9953 | 14371 | 1.0861          | 0.7788   |
| 0.081         | 134.9930 | 14478 | 1.0989          | 0.7783   |
| 0.084         | 136.0    | 14586 | 1.0952          | 0.7770   |
| 0.0958        | 136.9977 | 14693 | 1.0558          | 0.7747   |
| 0.0943        | 137.9953 | 14800 | 1.0902          | 0.7833   |
| 0.055         | 138.9930 | 14907 | 1.1308          | 0.7797   |
| 0.0972        | 140.0    | 15015 | 1.0727          | 0.7842   |
| 0.0819        | 140.9977 | 15122 | 1.1066          | 0.7851   |
| 0.0885        | 141.9953 | 15229 | 1.1115          | 0.7752   |
| 0.0769        | 142.9930 | 15336 | 1.0922          | 0.7788   |
| 0.0668        | 144.0    | 15444 | 1.1498          | 0.7788   |
| 0.0836        | 144.9977 | 15551 | 1.1783          | 0.7729   |
| 0.1068        | 145.9953 | 15658 | 1.1379          | 0.7783   |
| 0.0656        | 146.9930 | 15765 | 1.1223          | 0.7806   |
| 0.0815        | 148.0    | 15873 | 1.1083          | 0.7783   |
| 0.06          | 148.9977 | 15980 | 1.1309          | 0.7783   |
| 0.0812        | 149.9953 | 16087 | 1.1336          | 0.7774   |
| 0.0771        | 150.9930 | 16194 | 1.1676          | 0.7833   |
| 0.0867        | 152.0    | 16302 | 1.1745          | 0.7824   |
| 0.075         | 152.9977 | 16409 | 1.1625          | 0.7779   |
| 0.0833        | 153.9953 | 16516 | 1.1656          | 0.7743   |
| 0.0788        | 154.9930 | 16623 | 1.1623          | 0.7783   |
| 0.0647        | 156.0    | 16731 | 1.1728          | 0.7788   |
| 0.0915        | 156.9977 | 16838 | 1.1869          | 0.7765   |
| 0.0814        | 157.9953 | 16945 | 1.1643          | 0.7824   |
| 0.0644        | 158.9930 | 17052 | 1.1955          | 0.7820   |
| 0.0899        | 160.0    | 17160 | 1.2017          | 0.7811   |
| 0.0746        | 160.9977 | 17267 | 1.1638          | 0.7801   |
| 0.0744        | 161.9953 | 17374 | 1.1992          | 0.7774   |
| 0.0868        | 162.9930 | 17481 | 1.2164          | 0.7824   |
| 0.0351        | 164.0    | 17589 | 1.2245          | 0.7738   |
| 0.0586        | 164.9977 | 17696 | 1.1803          | 0.7829   |
| 0.0558        | 165.9953 | 17803 | 1.2584          | 0.7756   |
| 0.0982        | 166.9930 | 17910 | 1.2349          | 0.7765   |
| 0.0711        | 168.0    | 18018 | 1.3022          | 0.7756   |
| 0.0416        | 168.9977 | 18125 | 1.1992          | 0.7783   |
| 0.053         | 169.9953 | 18232 | 1.2331          | 0.7752   |
| 0.0458        | 170.9930 | 18339 | 1.1994          | 0.7833   |
| 0.0698        | 172.0    | 18447 | 1.2182          | 0.7779   |
| 0.0823        | 172.9977 | 18554 | 1.2031          | 0.7788   |
| 0.0705        | 173.9953 | 18661 | 1.1887          | 0.7874   |
| 0.0489        | 174.9930 | 18768 | 1.2278          | 0.7733   |
| 0.0513        | 176.0    | 18876 | 1.2315          | 0.7824   |
| 0.0587        | 176.9977 | 18983 | 1.2252          | 0.7806   |
| 0.0291        | 177.9953 | 19090 | 1.2223          | 0.7788   |
| 0.0549        | 178.9930 | 19197 | 1.2333          | 0.7797   |
| 0.0433        | 180.0    | 19305 | 1.2576          | 0.7811   |
| 0.0859        | 180.9977 | 19412 | 1.2297          | 0.7820   |
| 0.1015        | 181.9953 | 19519 | 1.2171          | 0.7801   |
| 0.0396        | 182.9930 | 19626 | 1.2269          | 0.7783   |
| 0.0892        | 184.0    | 19734 | 1.2359          | 0.7824   |
| 0.0561        | 184.9977 | 19841 | 1.2501          | 0.7820   |
| 0.0391        | 185.9953 | 19948 | 1.2284          | 0.7869   |
| 0.0673        | 186.9930 | 20055 | 1.2562          | 0.7783   |
| 0.0464        | 188.0    | 20163 | 1.2398          | 0.7815   |
| 0.0798        | 188.9977 | 20270 | 1.2727          | 0.7792   |
| 0.0543        | 189.9953 | 20377 | 1.2414          | 0.7838   |
| 0.0421        | 190.9930 | 20484 | 1.2468          | 0.7811   |
| 0.0563        | 192.0    | 20592 | 1.2546          | 0.7833   |
| 0.0638        | 192.9977 | 20699 | 1.2530          | 0.7824   |
| 0.0571        | 193.9953 | 20806 | 1.2487          | 0.7824   |
| 0.05          | 194.9930 | 20913 | 1.2502          | 0.7797   |
| 0.0743        | 196.0    | 21021 | 1.2507          | 0.7797   |
| 0.0282        | 196.9977 | 21128 | 1.2481          | 0.7820   |
| 0.0629        | 197.9953 | 21235 | 1.2416          | 0.7829   |
| 0.0285        | 198.9930 | 21342 | 1.2423          | 0.7842   |
| 0.0475        | 199.5338 | 21400 | 1.2422          | 0.7833   |
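
Reading the table above: validation loss reaches its minimum of 0.5227 around epoch 33 and then climbs steadily to 1.24 by the end of training, while accuracy plateaus near 0.78 (peaking at 0.7915 around epochs 110 and 133). The later epochs therefore appear to mostly overfit, and the final checkpoint is not the best one by validation loss.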

### Framework versions

- Transformers 4.44.0
- Pytorch 1.12.1+cu113
- Datasets 2.21.0
- Tokenizers 0.19.1
