jordyvl committed on
Commit d8e7776 · 1 Parent(s): 60ae308

Saving best model to hub

README.md ADDED
@@ -0,0 +1,116 @@
+ ---
+ license: apache-2.0
+ base_model: bdpc/resnet101-base_tobacco
+ tags:
+ - generated_from_trainer
+ metrics:
+ - accuracy
+ model-index:
+ - name: resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.0_a0.5
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.0_a0.5
+
+ This model is a fine-tuned version of [bdpc/resnet101-base_tobacco](https://huggingface.co/bdpc/resnet101-base_tobacco) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.8577
+ - Accuracy: 0.53
+ - Brier Loss: 0.6406
+ - Nll: 2.1208
+ - F1 Micro: 0.53
+ - F1 Macro: 0.4957
+ - Ece: 0.3004
+ - Aurc: 0.3168
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 0.0001
+ - train_batch_size: 256
+ - eval_batch_size: 256
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_ratio: 0.1
+ - num_epochs: 50
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:-------:|:--------:|:--------:|:------:|:------:|
+ | No log | 1.0 | 4 | 1.4267 | 0.05 | 0.9008 | 9.6592 | 0.0500 | 0.0177 | 0.1432 | 0.9439 |
+ | No log | 2.0 | 8 | 1.4006 | 0.155 | 0.8969 | 7.9140 | 0.155 | 0.0268 | 0.2365 | 0.9603 |
+ | No log | 3.0 | 12 | 1.4621 | 0.155 | 0.9457 | 13.3695 | 0.155 | 0.0268 | 0.3013 | 0.9107 |
+ | No log | 4.0 | 16 | 2.1836 | 0.155 | 1.3252 | 12.8977 | 0.155 | 0.0268 | 0.6400 | 0.7514 |
+ | No log | 5.0 | 20 | 2.4365 | 0.155 | 1.3998 | 8.4435 | 0.155 | 0.0268 | 0.7030 | 0.6102 |
+ | No log | 6.0 | 24 | 2.1554 | 0.155 | 1.2534 | 6.9190 | 0.155 | 0.0279 | 0.5987 | 0.6271 |
+ | No log | 7.0 | 28 | 1.5617 | 0.175 | 0.9637 | 5.7454 | 0.175 | 0.0462 | 0.3802 | 0.6485 |
+ | No log | 8.0 | 32 | 1.3267 | 0.245 | 0.8707 | 5.2368 | 0.245 | 0.0835 | 0.2961 | 0.5438 |
+ | No log | 9.0 | 36 | 1.2434 | 0.19 | 0.8886 | 5.0360 | 0.19 | 0.0471 | 0.3198 | 0.7720 |
+ | No log | 10.0 | 40 | 1.0721 | 0.305 | 0.8123 | 4.5157 | 0.305 | 0.1762 | 0.2684 | 0.5269 |
+ | No log | 11.0 | 44 | 1.1256 | 0.22 | 0.8429 | 3.9215 | 0.22 | 0.1083 | 0.2812 | 0.7346 |
+ | No log | 12.0 | 48 | 0.9865 | 0.35 | 0.7676 | 3.4553 | 0.35 | 0.2565 | 0.2884 | 0.4790 |
+ | No log | 13.0 | 52 | 1.0206 | 0.355 | 0.7899 | 3.3582 | 0.3550 | 0.2278 | 0.2954 | 0.5883 |
+ | No log | 14.0 | 56 | 0.9096 | 0.415 | 0.6994 | 3.2174 | 0.415 | 0.3147 | 0.2563 | 0.3596 |
+ | No log | 15.0 | 60 | 0.9187 | 0.415 | 0.7129 | 3.2059 | 0.415 | 0.2742 | 0.2941 | 0.3971 |
+ | No log | 16.0 | 64 | 0.8905 | 0.395 | 0.6956 | 2.9931 | 0.395 | 0.2618 | 0.2590 | 0.3826 |
+ | No log | 17.0 | 68 | 0.9108 | 0.425 | 0.7073 | 3.1634 | 0.425 | 0.2855 | 0.2995 | 0.3685 |
+ | No log | 18.0 | 72 | 0.8769 | 0.465 | 0.6706 | 3.1088 | 0.465 | 0.3652 | 0.2855 | 0.3261 |
+ | No log | 19.0 | 76 | 0.8585 | 0.475 | 0.6687 | 2.8710 | 0.4750 | 0.3884 | 0.2916 | 0.3282 |
+ | No log | 20.0 | 80 | 0.9822 | 0.405 | 0.7378 | 2.8889 | 0.405 | 0.3570 | 0.2850 | 0.4895 |
+ | No log | 21.0 | 84 | 0.9324 | 0.445 | 0.6992 | 2.7975 | 0.445 | 0.3553 | 0.3021 | 0.3762 |
+ | No log | 22.0 | 88 | 1.0330 | 0.42 | 0.7350 | 2.7487 | 0.4200 | 0.3506 | 0.2984 | 0.4771 |
+ | No log | 23.0 | 92 | 0.8755 | 0.455 | 0.6674 | 2.5903 | 0.455 | 0.3415 | 0.2570 | 0.3352 |
+ | No log | 24.0 | 96 | 0.8651 | 0.47 | 0.6443 | 2.8456 | 0.47 | 0.3800 | 0.2451 | 0.2975 |
+ | No log | 25.0 | 100 | 0.9567 | 0.445 | 0.7150 | 2.7083 | 0.445 | 0.3727 | 0.2667 | 0.4676 |
+ | No log | 26.0 | 104 | 1.0224 | 0.42 | 0.7376 | 2.4408 | 0.4200 | 0.3367 | 0.2968 | 0.5019 |
+ | No log | 27.0 | 108 | 0.8365 | 0.525 | 0.6407 | 2.6426 | 0.525 | 0.4496 | 0.2960 | 0.2657 |
+ | No log | 28.0 | 112 | 0.9798 | 0.425 | 0.7287 | 2.6379 | 0.425 | 0.3489 | 0.2640 | 0.4668 |
+ | No log | 29.0 | 116 | 0.9226 | 0.44 | 0.6965 | 2.5748 | 0.44 | 0.3669 | 0.2561 | 0.4054 |
+ | No log | 30.0 | 120 | 0.8303 | 0.49 | 0.6398 | 2.4839 | 0.49 | 0.3924 | 0.2981 | 0.2936 |
+ | No log | 31.0 | 124 | 0.8426 | 0.52 | 0.6478 | 2.5282 | 0.52 | 0.4322 | 0.3109 | 0.3084 |
+ | No log | 32.0 | 128 | 0.9111 | 0.45 | 0.6970 | 2.3870 | 0.45 | 0.3947 | 0.2837 | 0.4448 |
+ | No log | 33.0 | 132 | 0.8723 | 0.51 | 0.6524 | 2.6124 | 0.51 | 0.4170 | 0.2536 | 0.3365 |
+ | No log | 34.0 | 136 | 0.8936 | 0.47 | 0.6671 | 2.8892 | 0.47 | 0.3814 | 0.2436 | 0.3357 |
+ | No log | 35.0 | 140 | 1.2870 | 0.42 | 0.7660 | 4.4020 | 0.4200 | 0.3468 | 0.2860 | 0.4606 |
+ | No log | 36.0 | 144 | 0.9991 | 0.455 | 0.7289 | 2.6973 | 0.455 | 0.4132 | 0.3272 | 0.4684 |
+ | No log | 37.0 | 148 | 1.6352 | 0.365 | 0.8356 | 4.7695 | 0.3650 | 0.3020 | 0.3312 | 0.6069 |
+ | No log | 38.0 | 152 | 1.3014 | 0.39 | 0.8213 | 2.9436 | 0.39 | 0.3382 | 0.3262 | 0.5476 |
+ | No log | 39.0 | 156 | 1.0294 | 0.415 | 0.7361 | 2.7188 | 0.415 | 0.3446 | 0.2454 | 0.4632 |
+ | No log | 40.0 | 160 | 0.8825 | 0.52 | 0.6538 | 2.3887 | 0.52 | 0.4608 | 0.2721 | 0.3186 |
+ | No log | 41.0 | 164 | 0.8572 | 0.54 | 0.6288 | 2.4201 | 0.54 | 0.4822 | 0.2963 | 0.2899 |
+ | No log | 42.0 | 168 | 0.8393 | 0.535 | 0.6291 | 2.3587 | 0.535 | 0.4726 | 0.2824 | 0.2937 |
+ | No log | 43.0 | 172 | 0.8369 | 0.515 | 0.6303 | 2.4060 | 0.515 | 0.4583 | 0.2689 | 0.2903 |
+ | No log | 44.0 | 176 | 0.8458 | 0.49 | 0.6346 | 2.3323 | 0.49 | 0.4428 | 0.2526 | 0.2951 |
+ | No log | 45.0 | 180 | 0.8446 | 0.49 | 0.6367 | 2.2207 | 0.49 | 0.4289 | 0.2655 | 0.3041 |
+ | No log | 46.0 | 184 | 0.8324 | 0.54 | 0.6289 | 2.3685 | 0.54 | 0.4779 | 0.2571 | 0.2873 |
+ | No log | 47.0 | 188 | 0.8658 | 0.515 | 0.6486 | 2.3922 | 0.515 | 0.4584 | 0.2623 | 0.3100 |
+ | No log | 48.0 | 192 | 0.8516 | 0.525 | 0.6410 | 2.4448 | 0.525 | 0.4700 | 0.3006 | 0.3044 |
+ | No log | 49.0 | 196 | 0.8520 | 0.55 | 0.6350 | 2.2049 | 0.55 | 0.4947 | 0.3030 | 0.2980 |
+ | No log | 50.0 | 200 | 0.8577 | 0.53 | 0.6406 | 2.1208 | 0.53 | 0.4957 | 0.3004 | 0.3168 |
+
+
+ ### Framework versions
+
+ - Transformers 4.36.0.dev0
+ - Pytorch 2.2.0.dev20231112+cu118
+ - Datasets 2.14.5
+ - Tokenizers 0.14.1
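
The hyperparameter list in the card above translates almost one-to-one into `transformers.TrainingArguments`. The sketch below is a reading aid, not the training script behind this commit: `output_dir` is a placeholder, a single device is assumed (so the per-device batch size equals the listed 256), and the distillation objective implied by the name (CEKD, t=1.0, a=0.5) would need a custom `Trainer` loss that is not shown.

```python
# Sketch only: the model-card hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.0_a0.5",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=256,   # assuming a single device
    per_device_eval_batch_size=256,
    seed=42,
    adam_beta1=0.9,                    # Adam with betas=(0.9, 0.999), epsilon=1e-08
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```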
config.json ADDED
@@ -0,0 +1,65 @@
+ {
+   "_name_or_path": "bdpc/resnet101-base_tobacco",
+   "architectures": [
+     "ResNetForImageClassification"
+   ],
+   "depths": [
+     3,
+     4,
+     23,
+     3
+   ],
+   "downsample_in_bottleneck": false,
+   "downsample_in_first_stage": false,
+   "embedding_size": 64,
+   "hidden_act": "relu",
+   "hidden_sizes": [
+     256,
+     512,
+     1024,
+     2048
+   ],
+   "id2label": {
+     "0": "ADVE",
+     "1": "Email",
+     "2": "Form",
+     "3": "Letter",
+     "4": "Memo",
+     "5": "News",
+     "6": "Note",
+     "7": "Report",
+     "8": "Resume",
+     "9": "Scientific"
+   },
+   "label2id": {
+     "ADVE": 0,
+     "Email": 1,
+     "Form": 2,
+     "Letter": 3,
+     "Memo": 4,
+     "News": 5,
+     "Note": 6,
+     "Report": 7,
+     "Resume": 8,
+     "Scientific": 9
+   },
+   "layer_type": "bottleneck",
+   "model_type": "resnet",
+   "num_channels": 3,
+   "out_features": [
+     "stage4"
+   ],
+   "out_indices": [
+     4
+   ],
+   "problem_type": "single_label_classification",
+   "stage_names": [
+     "stem",
+     "stage1",
+     "stage2",
+     "stage3",
+     "stage4"
+   ],
+   "torch_dtype": "float32",
+   "transformers_version": "4.36.0.dev0"
+ }
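
As a usage sketch against the config above: the ten `id2label` entries are document classes (the Tobacco3482 label set), so inference is a standard image-classification call. The repository id is a placeholder, and loading the image processor from the base checkpoint is an assumption, since no preprocessor config is added in this commit.

```python
# Minimal inference sketch (not from this repository itself).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

MODEL_ID = "<this-checkpoint>"                                                  # placeholder
processor = AutoImageProcessor.from_pretrained("bdpc/resnet101-base_tobacco")  # assumption
model = AutoModelForImageClassification.from_pretrained(MODEL_ID)

image = Image.open("document.png").convert("RGB")                               # placeholder input
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])  # e.g. "Email", "Memo", ...
```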
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:08bb4ceebc7f9b4ae151d9bde3a761582adf5b8a63b365c1a4cbfc24c6d85c0f
+ size 170587112
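
The weights are stored through Git LFS, so the three lines above are only the pointer; the ~171 MB blob holds the ResNet-101 classifier parameters. To inspect it outside `from_pretrained`, a sketch (after fetching the blob, e.g. with `git lfs pull`):

```python
# Sketch: load the raw safetensors weight file and report its size.
from safetensors.torch import load_file

state_dict = load_file("model.safetensors")
print(len(state_dict), "tensors,", sum(t.numel() for t in state_dict.values()), "parameters")
```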
test-logits.npz ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:bd9d27565bd2d106e02fb2c1f406be5633218b43f2632b0e3ba3d6754d03eada
+ size 92066
test-references.npz ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a2afcfdc977d6e963da44f7d0b6169569f722c36f36eb2c2798b49630510363b
+ size 2128
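
The test-logits.npz / test-references.npz pair above presumably stores the evaluation logits and reference labels behind the reported metrics (the validation-*.npz pair further down would follow the same layout). Below is a sketch of re-deriving accuracy and the Brier score from them; the array keys, shapes, and the exact Brier normalization behind the reported 0.6406 are assumptions.

```python
# Sketch, under assumptions: test-logits.npz holds one (N, 10) logit array and
# test-references.npz one (N,) integer label array; key names are not documented
# in this commit, so the first array in each archive is taken.
import numpy as np

logit_file = np.load("test-logits.npz")
label_file = np.load("test-references.npz")
logits = logit_file[logit_file.files[0]]
labels = label_file[label_file.files[0]].astype(int)

# softmax over classes
probs = np.exp(logits - logits.max(axis=1, keepdims=True))
probs /= probs.sum(axis=1, keepdims=True)

accuracy = (probs.argmax(axis=1) == labels).mean()
onehot = np.eye(probs.shape[1])[labels]
brier = ((probs - onehot) ** 2).sum(axis=1).mean()  # multi-class Brier score
print(f"accuracy={accuracy:.4f}  brier={brier:.4f}")
```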
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c047868033fa1a6b088acb9e506eef8fb2b9d5163a45f7030ccf11b1b2ce3c1b
+ size 4920
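
training_args.bin is the pickled `TrainingArguments` object the `Trainer` saves next to a checkpoint, again stored here as an LFS pointer. Once the blob is fetched it can be inspected directly; `weights_only=False` is needed on recent PyTorch because the file is a pickled Python object rather than a tensor checkpoint.

```python
# Sketch: inspect the serialized TrainingArguments after `git lfs pull`.
import torch

args = torch.load("training_args.bin", weights_only=False)
print(type(args).__name__, args.learning_rate, args.num_train_epochs, args.lr_scheduler_type)
```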
validation-logits.npz ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2150bb474f8900c0bf778921a7873f25c42c96a4050c17c66937999b770541f9
+ size 7641
validation-references.npz ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0354b78de1e153edfd908a412b596b1a05abea3df9a94323763cbb1ee2631790
+ size 423