Augusto777 committed
Commit b1b67f4 · verified · 1 Parent(s): deac22c

Model save

README.md ADDED
@@ -0,0 +1,195 @@
---
library_name: transformers
license: apache-2.0
base_model: microsoft/swinv2-tiny-patch4-window8-256
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: swinv2-tiny-patch4-window8-256-DMAE-da3-colab
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: validation
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.21739130434782608
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# swinv2-tiny-patch4-window8-256-DMAE-da3-colab

This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4532
- Accuracy: 0.2174

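For quick inference, a minimal sketch using the `transformers` image-classification pipeline; the repo id and image path below are assumptions (the repo id is inferred from the model name and the committer's namespace), not something stated in this card:

```python
# Minimal inference sketch (assumed repo id and placeholder image path).
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="Augusto777/swinv2-tiny-patch4-window8-256-DMAE-da3-colab",  # hypothetical repo id
)

image = Image.open("example.jpg")  # placeholder image path
for prediction in classifier(image):
    print(prediction["label"], round(prediction["score"], 4))
```
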
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 120

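These settings map onto `TrainingArguments` roughly as below; a minimal sketch assuming Transformers 4.46-style argument names, with `output_dir` as a placeholder and data loading, preprocessing, and the `Trainer` itself omitted:

```python
# Sketch of the hyperparameters above expressed as TrainingArguments
# (illustrative only; dataset and model setup are not shown).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swinv2-tiny-patch4-window8-256-DMAE-da3-colab",  # placeholder
    learning_rate=1e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,   # effective train batch size: 16 * 4 = 64
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=120,
)
```
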
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:--------:|:----:|:---------------:|:--------:|
| 1.3523 | 0.9778 | 22 | 1.4024 | 0.3261 |
| 1.3805 | 2.0 | 45 | 1.3775 | 0.2609 |
| 1.3221 | 2.9778 | 67 | 1.4419 | 0.3043 |
| 1.297 | 4.0 | 90 | 1.3582 | 0.3261 |
| 1.353 | 4.9778 | 112 | 1.3406 | 0.3478 |
| 1.2627 | 6.0 | 135 | 1.3824 | 0.1522 |
| 1.3006 | 6.9778 | 157 | 1.4008 | 0.1522 |
| 1.2438 | 8.0 | 180 | 1.3769 | 0.3261 |
| 1.222 | 8.9778 | 202 | 1.4212 | 0.3043 |
| 1.2221 | 10.0 | 225 | 1.4223 | 0.2391 |
| 1.2262 | 10.9778 | 247 | 1.4154 | 0.2609 |
| 1.2381 | 12.0 | 270 | 1.3327 | 0.2391 |
| 1.227 | 12.9778 | 292 | 1.2887 | 0.2826 |
| 1.2158 | 14.0 | 315 | 1.3465 | 0.2609 |
| 1.2174 | 14.9778 | 337 | 1.3476 | 0.3043 |
| 1.1767 | 16.0 | 360 | 1.4024 | 0.1957 |
| 1.2067 | 16.9778 | 382 | 1.3664 | 0.1739 |
| 1.2303 | 18.0 | 405 | 1.4260 | 0.2826 |
| 1.222 | 18.9778 | 427 | 1.4807 | 0.1739 |
| 1.2026 | 20.0 | 450 | 1.3851 | 0.1739 |
| 1.2185 | 20.9778 | 472 | 1.3214 | 0.2609 |
| 1.2773 | 22.0 | 495 | 1.4404 | 0.1957 |
| 1.227 | 22.9778 | 517 | 1.4535 | 0.2391 |
| 1.2032 | 24.0 | 540 | 1.3967 | 0.3043 |
| 1.2223 | 24.9778 | 562 | 1.4090 | 0.3261 |
| 1.2527 | 26.0 | 585 | 1.4858 | 0.2609 |
| 1.2203 | 26.9778 | 607 | 1.4366 | 0.1739 |
| 1.1993 | 28.0 | 630 | 1.4056 | 0.2609 |
| 1.2014 | 28.9778 | 652 | 1.3755 | 0.3043 |
| 1.2027 | 30.0 | 675 | 1.4579 | 0.2609 |
| 1.1961 | 30.9778 | 697 | 1.4524 | 0.2609 |
| 1.1939 | 32.0 | 720 | 1.4488 | 0.2391 |
| 1.1889 | 32.9778 | 742 | 1.4568 | 0.1522 |
| 1.1871 | 34.0 | 765 | 1.3814 | 0.3261 |
| 1.1778 | 34.9778 | 787 | 1.4403 | 0.1304 |
| 1.2404 | 36.0 | 810 | 1.4437 | 0.1957 |
| 1.197 | 36.9778 | 832 | 1.4765 | 0.2174 |
| 1.2161 | 38.0 | 855 | 1.3720 | 0.2391 |
| 1.221 | 38.9778 | 877 | 1.3750 | 0.3478 |
| 1.229 | 40.0 | 900 | 1.3405 | 0.2391 |
| 1.2046 | 40.9778 | 922 | 1.4231 | 0.2609 |
| 1.2077 | 42.0 | 945 | 1.4384 | 0.2391 |
| 1.1865 | 42.9778 | 967 | 1.4346 | 0.2609 |
| 1.1882 | 44.0 | 990 | 1.3679 | 0.2826 |
| 1.2528 | 44.9778 | 1012 | 1.3451 | 0.2174 |
| 1.1836 | 46.0 | 1035 | 1.4913 | 0.2391 |
| 1.2009 | 46.9778 | 1057 | 1.4841 | 0.3261 |
| 1.203 | 48.0 | 1080 | 1.4326 | 0.3043 |
| 1.1679 | 48.9778 | 1102 | 1.3935 | 0.3043 |
| 1.179 | 50.0 | 1125 | 1.4185 | 0.1957 |
| 1.1687 | 50.9778 | 1147 | 1.3686 | 0.2826 |
| 1.1779 | 52.0 | 1170 | 1.4319 | 0.1957 |
| 1.1566 | 52.9778 | 1192 | 1.3801 | 0.1957 |
| 1.192 | 54.0 | 1215 | 1.3746 | 0.2174 |
| 1.1803 | 54.9778 | 1237 | 1.4017 | 0.1957 |
| 1.194 | 56.0 | 1260 | 1.4288 | 0.1957 |
| 1.1486 | 56.9778 | 1282 | 1.3920 | 0.3043 |
| 1.1429 | 58.0 | 1305 | 1.4616 | 0.2391 |
| 1.1655 | 58.9778 | 1327 | 1.4119 | 0.2174 |
| 1.1697 | 60.0 | 1350 | 1.3812 | 0.2609 |
| 1.1898 | 60.9778 | 1372 | 1.4009 | 0.2391 |
| 1.1882 | 62.0 | 1395 | 1.4221 | 0.2391 |
| 1.134 | 62.9778 | 1417 | 1.6190 | 0.2609 |
| 1.1748 | 64.0 | 1440 | 1.4336 | 0.2391 |
| 1.1439 | 64.9778 | 1462 | 1.3744 | 0.1957 |
| 1.1585 | 66.0 | 1485 | 1.3992 | 0.3696 |
| 1.1344 | 66.9778 | 1507 | 1.3952 | 0.2391 |
| 1.1374 | 68.0 | 1530 | 1.3666 | 0.2174 |
| 1.1252 | 68.9778 | 1552 | 1.3705 | 0.2826 |
| 1.1339 | 70.0 | 1575 | 1.3983 | 0.2826 |
| 1.1344 | 70.9778 | 1597 | 1.3792 | 0.3043 |
| 1.1343 | 72.0 | 1620 | 1.4467 | 0.2826 |
| 1.1555 | 72.9778 | 1642 | 1.4823 | 0.2174 |
| 1.1329 | 74.0 | 1665 | 1.5136 | 0.1522 |
| 1.1513 | 74.9778 | 1687 | 1.4791 | 0.2391 |
| 1.1278 | 76.0 | 1710 | 1.4527 | 0.2609 |
| 1.0956 | 76.9778 | 1732 | 1.4840 | 0.2391 |
| 1.1131 | 78.0 | 1755 | 1.4900 | 0.2174 |
| 1.1376 | 78.9778 | 1777 | 1.5395 | 0.2174 |
| 1.0883 | 80.0 | 1800 | 1.5038 | 0.1957 |
| 1.1017 | 80.9778 | 1822 | 1.5392 | 0.1957 |
| 1.1608 | 82.0 | 1845 | 1.4875 | 0.2174 |
| 1.1308 | 82.9778 | 1867 | 1.5080 | 0.1957 |
| 1.1382 | 84.0 | 1890 | 1.4835 | 0.1739 |
| 1.1195 | 84.9778 | 1912 | 1.4076 | 0.1957 |
| 1.1149 | 86.0 | 1935 | 1.4840 | 0.1739 |
| 1.1344 | 86.9778 | 1957 | 1.4733 | 0.1957 |
| 1.1268 | 88.0 | 1980 | 1.4446 | 0.2391 |
| 1.1267 | 88.9778 | 2002 | 1.4360 | 0.2174 |
| 1.1034 | 90.0 | 2025 | 1.4329 | 0.1522 |
| 1.1113 | 90.9778 | 2047 | 1.4670 | 0.1739 |
| 1.0957 | 92.0 | 2070 | 1.4802 | 0.2391 |
| 1.1227 | 92.9778 | 2092 | 1.4715 | 0.1739 |
| 1.1083 | 94.0 | 2115 | 1.4813 | 0.1957 |
| 1.0583 | 94.9778 | 2137 | 1.5203 | 0.1957 |
| 1.093 | 96.0 | 2160 | 1.5394 | 0.1739 |
| 1.0809 | 96.9778 | 2182 | 1.4620 | 0.1739 |
| 1.0888 | 98.0 | 2205 | 1.4407 | 0.1739 |
| 1.1292 | 98.9778 | 2227 | 1.4578 | 0.1957 |
| 1.0754 | 100.0 | 2250 | 1.5031 | 0.1739 |
| 1.0817 | 100.9778 | 2272 | 1.4461 | 0.2174 |
| 1.0671 | 102.0 | 2295 | 1.4723 | 0.2391 |
| 1.0815 | 102.9778 | 2317 | 1.4989 | 0.1957 |
| 1.0967 | 104.0 | 2340 | 1.4654 | 0.2174 |
| 1.091 | 104.9778 | 2362 | 1.4559 | 0.2174 |
| 1.0895 | 106.0 | 2385 | 1.4221 | 0.2826 |
| 1.0847 | 106.9778 | 2407 | 1.4293 | 0.2826 |
| 1.102 | 108.0 | 2430 | 1.4582 | 0.2391 |
| 1.0404 | 108.9778 | 2452 | 1.4656 | 0.2174 |
| 1.0488 | 110.0 | 2475 | 1.4890 | 0.2174 |
| 1.0966 | 110.9778 | 2497 | 1.4632 | 0.2174 |
| 1.0901 | 112.0 | 2520 | 1.4495 | 0.2174 |
| 1.1008 | 112.9778 | 2542 | 1.4333 | 0.2174 |
| 1.0884 | 114.0 | 2565 | 1.4406 | 0.2174 |
| 1.0889 | 114.9778 | 2587 | 1.4474 | 0.2174 |
| 1.0729 | 116.0 | 2610 | 1.4561 | 0.2174 |
| 1.0671 | 116.9778 | 2632 | 1.4538 | 0.2174 |
| 1.0937 | 117.3333 | 2640 | 1.4532 | 0.2174 |


### Framework versions

- Transformers 4.46.3
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
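
To check that a local environment matches these versions, a small sketch (assuming all four packages are importable):

```python
# Print installed versions to compare against the list above.
import transformers, torch, datasets, tokenizers

for name, module in [
    ("Transformers", transformers),
    ("Pytorch", torch),
    ("Datasets", datasets),
    ("Tokenizers", tokenizers),
]:
    print(f"{name}: {module.__version__}")
```
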
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:4f68583fd1e8e559c8a66e07e37d45e518256f791c57de553aeb9b676baf8152
+ oid sha256:d8e6941044ddcf1b6a2d6cc2f49d647a8822a5cda5b188719f3579e185863a4b
  size 110356296
runs/Dec05_23-48-52_8fb6626b3f6f/events.out.tfevents.1733442536.8fb6626b3f6f.752.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:eebec5725ca6f118558863ca4cc28cf309aa1a95eaae2e98adc27c896347b880
- size 98873
+ oid sha256:3dde51c9682268e7227de459d659f4a58acab2b16bc97c8db5d14ba165e128f5
+ size 99761