badinkajink committed
Commit 1e46cc5 · 1 Parent(s): b1f0943

End of training

README.md ADDED
@@ -0,0 +1,101 @@
+ ---
+ license: other
+ base_model: nvidia/mit-b0
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: segformer-webots-grasp
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # segformer-webots-grasp
+
+ This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 4.4405
+ - Mean Iou: 0.0802
+ - Mean Accuracy: 0.8142
+ - Overall Accuracy: 0.8142
+ - Per Category Iou: [0.8018109121927621, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0]
+ - Per Category Accuracy: [0.8141941463133944, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]
+
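+ ## How to use
+
+ The auto-generated card ships no inference example, so the following is a minimal sketch rather than an official recipe. It assumes the checkpoint is published as `badinkajink/segformer-webots-grasp`, that an input image lives at `scene.png`, and that the image processor is loaded from the `nvidia/mit-b0` base model, since this commit does not add a `preprocessor_config.json`.
+
+ ```python
+ import torch
+ from PIL import Image
+ from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor
+
+ # Assumed repo id; the processor comes from the base model because this
+ # commit does not include a preprocessor_config.json.
+ processor = SegformerImageProcessor.from_pretrained("nvidia/mit-b0")
+ model = SegformerForSemanticSegmentation.from_pretrained("badinkajink/segformer-webots-grasp")
+ model.eval()
+
+ image = Image.open("scene.png").convert("RGB")  # hypothetical Webots frame
+ inputs = processor(images=image, return_tensors="pt")
+
+ with torch.no_grad():
+     logits = model(**inputs).logits  # (1, 225, H/4, W/4)
+
+ # SegFormer predicts at 1/4 resolution; upsample back to the input size
+ # and take the per-pixel argmax to obtain a class-id mask.
+ upsampled = torch.nn.functional.interpolate(
+     logits, size=image.size[::-1], mode="bilinear", align_corners=False
+ )
+ mask = upsampled.argmax(dim=1)[0]
+ ```
+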
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 6e-05
+ - train_batch_size: 2
+ - eval_batch_size: 2
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 1
+
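+ The training script itself is not part of this commit; the snippet below is a sketch of a `TrainingArguments` object matching the values above. `output_dir` is an assumption, and the default AdamW optimizer supplies the listed betas and epsilon anyway, so they are spelled out here only for clarity.
+
+ ```python
+ from transformers import TrainingArguments
+
+ # Mirrors the hyperparameters reported in this card.
+ training_args = TrainingArguments(
+     output_dir="segformer-webots-grasp",  # assumed
+     learning_rate=6e-05,
+     per_device_train_batch_size=2,
+     per_device_eval_batch_size=2,
+     seed=42,
+     adam_beta1=0.9,
+     adam_beta2=0.999,
+     adam_epsilon=1e-08,
+     lr_scheduler_type="linear",
+     num_train_epochs=1,
+ )
+ ```
+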
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
+ |:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
+ | 5.4277 | 0.03 | 1 | 5.4055 | 0.0001 | 0.0011 | 0.0011 | [0.0010665493208110168, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.001069721346006343, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 5.3872 | 0.05 | 2 | 5.3965 | 0.0004 | 0.0053 | 0.0053 | [0.005245532234607124, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.005252500148444879, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 5.3659 | 0.07 | 3 | 5.3862 | 0.0013 | 0.0160 | 0.0160 | [0.01595653583377678, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.015969302211124733, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 5.2685 | 0.1 | 4 | 5.3741 | 0.0037 | 0.0445 | 0.0445 | [0.04445613702342968, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.044479595103583916, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 5.2464 | 0.12 | 5 | 5.3603 | 0.0113 | 0.1359 | 0.1359 | [0.13585817440138379, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.1359222528362154, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 5.1881 | 0.15 | 6 | 5.3461 | 0.0240 | 0.2880 | 0.2880 | [0.287886510883373, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.2879980631769163, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 5.1556 | 0.17 | 7 | 5.3322 | 0.0329 | 0.3949 | 0.3949 | [0.3948099641784779, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.3949236748463366, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 5.0952 | 0.2 | 8 | 5.3175 | 0.0400 | 0.4807 | 0.4807 | [0.4805425893875233, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, 0.0] | [0.4807401186212317, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 5.0421 | 0.23 | 9 | 5.3015 | 0.0606 | 0.5455 | 0.5455 | [0.5450016850554845, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.5454547402275828, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 5.0168 | 0.25 | 10 | 5.2843 | 0.0659 | 0.5936 | 0.5936 | [0.5926648159557198, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.5935808761369806, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.9417 | 0.28 | 11 | 5.2641 | 0.0575 | 0.6341 | 0.6341 | [0.6324597013631272, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.6340619477314868, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.8962 | 0.3 | 12 | 5.2394 | 0.0606 | 0.6696 | 0.6696 | [0.6670764824873869, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.6695910817989438, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.9963 | 0.33 | 13 | 5.2132 | 0.0574 | 0.6918 | 0.6918 | [0.6887950316615776, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.6918253700562499, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.9677 | 0.35 | 14 | 5.1887 | 0.0703 | 0.7061 | 0.7061 | [0.7026749604186757, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7060962792061474, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.921 | 0.38 | 15 | 5.1585 | 0.0799 | 0.7228 | 0.7228 | [0.7188799212945968, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7228307612069762, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.9849 | 0.4 | 16 | 5.1298 | 0.0808 | 0.7311 | 0.7311 | [0.7268635588232967, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7311038850932571, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.8045 | 0.42 | 17 | 5.0921 | 0.0823 | 0.7458 | 0.7458 | [0.7409411768283134, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7458280867505752, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.6795 | 0.45 | 18 | 5.0554 | 0.0835 | 0.7575 | 0.7575 | [0.7518766884727517, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7574526981157601, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.7354 | 0.47 | 19 | 5.0164 | 0.0848 | 0.7698 | 0.7698 | [0.7632263888994257, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7697950481236874, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.64 | 0.5 | 20 | 4.9783 | 0.0771 | 0.7779 | 0.7779 | [0.7705429510590344, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7778955474494109, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.6682 | 0.53 | 21 | 4.9336 | 0.0777 | 0.7858 | 0.7858 | [0.7773627351835143, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7858173007762596, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.6887 | 0.55 | 22 | 4.8904 | 0.0782 | 0.7914 | 0.7914 | [0.7821032704184515, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.791440760086753, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.5486 | 0.57 | 23 | 4.8541 | 0.0785 | 0.7943 | 0.7943 | [0.7845670293951555, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7943371186267411, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.6823 | 0.6 | 24 | 4.8202 | 0.0787 | 0.7972 | 0.7972 | [0.786876415221171, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.797181751012945, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.5797 | 0.62 | 25 | 4.7832 | 0.0789 | 0.8001 | 0.8001 | [0.7893493389876929, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8000900463576524, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.5976 | 0.65 | 26 | 4.7564 | 0.0789 | 0.7999 | 0.7999 | [0.7892736529850544, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.7998681442186382, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.5417 | 0.68 | 27 | 4.7152 | 0.0793 | 0.8040 | 0.8040 | [0.7927347074267436, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8040157247507503, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.608 | 0.7 | 28 | 4.6820 | 0.0794 | 0.8053 | 0.8053 | [0.7938510702940416, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8053333643486208, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.4955 | 0.72 | 29 | 4.6511 | 0.0795 | 0.8062 | 0.8062 | [0.7946513690612669, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8061980175109864, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.5359 | 0.75 | 30 | 4.6301 | 0.0794 | 0.8057 | 0.8057 | [0.7942641352298992, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8056544337883806, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.4116 | 0.78 | 31 | 4.5990 | 0.0797 | 0.8094 | 0.8094 | [0.7974774354228353, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8094237094324631, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.4995 | 0.8 | 32 | 4.5779 | 0.0798 | 0.8098 | 0.8098 | [0.7978091719139048, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8097701828412411, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.5087 | 0.82 | 33 | 4.5585 | 0.0799 | 0.8112 | 0.8112 | [0.7989926828268309, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8112010690479878, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.4534 | 0.85 | 34 | 4.5346 | 0.0799 | 0.8116 | 0.8116 | [0.7994616833790221, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8116476279732591, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.4347 | 0.88 | 35 | 4.5080 | 0.0800 | 0.8121 | 0.8121 | [0.7999173944895492, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8121012265525958, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.4389 | 0.9 | 36 | 4.4889 | 0.0800 | 0.8119 | 0.8119 | [0.7998204784246967, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8119105437490014, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.428 | 0.93 | 37 | 4.4843 | 0.0799 | 0.8105 | 0.8105 | [0.7987525641385264, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8105448508603376, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.5333 | 0.95 | 38 | 4.4847 | 0.0798 | 0.8094 | 0.8094 | [0.7978741861189972, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8093536189637262, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.4458 | 0.97 | 39 | 4.4634 | 0.0798 | 0.8098 | 0.8098 | [0.7981748026222505, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8097799771425493, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+ | 4.4749 | 1.0 | 40 | 4.4405 | 0.0802 | 0.8142 | 0.8142 | [0.8018109121927621, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0] | [0.8141941463133944, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
+
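+ The metric columns above match the output keys of the Hugging Face `evaluate` library's `mean_iou` metric, which is presumably what produced them. A minimal sketch of computing the same quantities (toy masks; `num_labels=225` and `ignore_index=255` taken from this model's config):
+
+ ```python
+ import numpy as np
+ import evaluate
+
+ metric = evaluate.load("mean_iou")
+
+ # Toy 4x4 prediction/reference masks, for illustration only.
+ pred = np.zeros((4, 4), dtype=np.int64)
+ ref = np.zeros((4, 4), dtype=np.int64)
+ ref[0, 0] = 1
+
+ results = metric.compute(
+     predictions=[pred],
+     references=[ref],
+     num_labels=225,
+     ignore_index=255,
+     reduce_labels=False,
+ )
+ print(results["mean_iou"], results["overall_accuracy"])
+ # Categories absent from the data come back as nan in the per-category
+ # arrays, which is why the table above contains many nan entries.
+ ```
+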
+ ### Framework versions
+
+ - Transformers 4.35.2
+ - PyTorch 2.1.0+cu118
+ - Datasets 2.15.0
+ - Tokenizers 0.15.0
config.json ADDED
@@ -0,0 +1,524 @@
+ {
+   "_name_or_path": "nvidia/mit-b0",
+   "architectures": [
+     "SegformerForSemanticSegmentation"
+   ],
+   "attention_probs_dropout_prob": 0.0,
+   "classifier_dropout_prob": 0.1,
+   "decoder_hidden_size": 256,
+   "depths": [2, 2, 2, 2],
+   "downsampling_rates": [1, 4, 8, 16],
+   "drop_path_rate": 0.1,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.0,
+   "hidden_sizes": [32, 64, 160, 256],
+   "id2label": {
+     "0": "dummy0", "1": "dummy1", "2": "dummy2", "3": "dummy3", "4": "dummy4", "5": "dummy5", "6": "dummy6", "7": "dummy7", "8": "dummy8", "9": "dummy9",
+     "10": "dummy10", "11": "dummy11", "12": "dummy12", "13": "dummy13", "14": "dummy14", "15": "dummy15", "16": "dummy16", "17": "dummy17", "18": "dummy18", "19": "dummy19",
+     "20": "dummy20", "21": "dummy21", "22": "dummy22", "23": "dummy23", "24": "dummy24", "25": "dummy25", "26": "dummy26", "27": "dummy27", "28": "dummy28", "29": "dummy29",
+     "30": "dummy30", "31": "dummy31", "32": "dummy32", "33": "dummy33", "34": "dummy34", "35": "dummy35", "36": "dummy36", "37": "dummy37", "38": "dummy38", "39": "dummy39",
+     "40": "dummy40", "41": "dummy41", "42": "dummy42", "43": "dummy43", "44": "dummy44", "45": "dummy45", "46": "dummy46", "47": "dummy47", "48": "dummy48", "49": "dummy49",
+     "50": "dummy50", "51": "dummy51", "52": "dummy52", "53": "dummy53", "54": "dummy54", "55": "dummy55", "56": "dummy56", "57": "dummy57", "58": "dummy58", "59": "dummy59",
+     "60": "dummy60", "61": "dummy61", "62": "dummy62", "63": "dummy63", "64": "dummy64", "65": "dummy65", "66": "dummy66", "67": "dummy67", "68": "dummy68", "69": "dummy69",
+     "70": "dummy70", "71": "dummy71", "72": "dummy72", "73": "dummy73", "74": "dummy74", "75": "dummy75", "76": "dummy76", "77": "dummy77", "78": "dummy78", "79": "dummy79",
+     "80": "dummy80", "81": "dummy81", "82": "dummy82", "83": "dummy83", "84": "dummy84", "85": "dummy85", "86": "dummy86", "87": "dummy87", "88": "dummy88", "89": "dummy89",
+     "90": "dummy90", "91": "dummy91", "92": "dummy92", "93": "dummy93", "94": "dummy94", "95": "dummy95", "96": "dummy96", "97": "dummy97", "98": "dummy98", "99": "dummy99",
+     "100": "dummy100", "101": "dummy101", "102": "dummy102", "103": "dummy103", "104": "dummy104", "105": "dummy105", "106": "dummy106", "107": "dummy107", "108": "dummy108", "109": "dummy109",
+     "110": "dummy110", "111": "dummy111", "112": "dummy112", "113": "dummy113", "114": "dummy114", "115": "dummy115", "116": "dummy116", "117": "dummy117", "118": "dummy118", "119": "dummy119",
+     "120": "dummy120", "121": "dummy121", "122": "dummy122", "123": "dummy123", "124": "dummy124", "125": "dummy125", "126": "dummy126", "127": "dummy127", "128": "dummy128", "129": "dummy129",
+     "130": "dummy130", "131": "dummy131", "132": "dummy132", "133": "dummy133", "134": "dummy134", "135": "dummy135", "136": "dummy136", "137": "dummy137", "138": "dummy138", "139": "dummy139",
+     "140": "dummy140", "141": "dummy141", "142": "dummy142", "143": "dummy143", "144": "dummy144", "145": "dummy145", "146": "dummy146", "147": "dummy147", "148": "dummy148", "149": "dummy149",
+     "150": "dummy150", "151": "dummy151", "152": "dummy152", "153": "dummy153", "154": "dummy154", "155": "dummy155", "156": "dummy156", "157": "dummy157", "158": "dummy158", "159": "dummy159",
+     "160": "dummy160", "161": "dummy161", "162": "dummy162", "163": "dummy163", "164": "dummy164", "165": "dummy165", "166": "dummy166", "167": "dummy167", "168": "dummy168", "169": "dummy169",
+     "170": "dummy170", "171": "dummy171", "172": "dummy172", "173": "dummy173", "174": "dummy174", "175": "dummy175", "176": "dummy176", "177": "dummy177", "178": "dummy178", "179": "dummy179",
+     "180": "dummy180", "181": "dummy181", "182": "dummy182", "183": "dummy183", "184": "dummy184", "185": "dummy185", "186": "dummy186", "187": "dummy187", "188": "dummy188", "189": "dummy189",
+     "190": "dummy190", "191": "dummy191", "192": "dummy192", "193": "dummy193", "194": "dummy194", "195": "dummy195", "196": "dummy196", "197": "dummy197", "198": "dummy198", "199": "dummy199",
+     "200": "dummy200", "201": "dummy201", "202": "dummy202", "203": "dummy203", "204": "dummy204", "205": "dummy205", "206": "dummy206", "207": "dummy207", "208": "dummy208", "209": "dummy209",
+     "210": "dummy210", "211": "dummy211", "212": "dummy212", "213": "dummy213", "214": "dummy214", "215": "dummy215", "216": "dummy216", "217": "dummy217", "218": "dummy218", "219": "dummy219",
+     "220": "dummy220", "221": "dummy221", "222": "dummy222", "223": "dummy223", "224": "dummy224"
+   },
+   "image_size": 224,
+   "initializer_range": 0.02,
+   "label2id": {
+     "dummy0": 0, "dummy1": 1, "dummy2": 2, "dummy3": 3, "dummy4": 4, "dummy5": 5, "dummy6": 6, "dummy7": 7, "dummy8": 8, "dummy9": 9,
+     "dummy10": 10, "dummy11": 11, "dummy12": 12, "dummy13": 13, "dummy14": 14, "dummy15": 15, "dummy16": 16, "dummy17": 17, "dummy18": 18, "dummy19": 19,
+     "dummy20": 20, "dummy21": 21, "dummy22": 22, "dummy23": 23, "dummy24": 24, "dummy25": 25, "dummy26": 26, "dummy27": 27, "dummy28": 28, "dummy29": 29,
+     "dummy30": 30, "dummy31": 31, "dummy32": 32, "dummy33": 33, "dummy34": 34, "dummy35": 35, "dummy36": 36, "dummy37": 37, "dummy38": 38, "dummy39": 39,
+     "dummy40": 40, "dummy41": 41, "dummy42": 42, "dummy43": 43, "dummy44": 44, "dummy45": 45, "dummy46": 46, "dummy47": 47, "dummy48": 48, "dummy49": 49,
+     "dummy50": 50, "dummy51": 51, "dummy52": 52, "dummy53": 53, "dummy54": 54, "dummy55": 55, "dummy56": 56, "dummy57": 57, "dummy58": 58, "dummy59": 59,
+     "dummy60": 60, "dummy61": 61, "dummy62": 62, "dummy63": 63, "dummy64": 64, "dummy65": 65, "dummy66": 66, "dummy67": 67, "dummy68": 68, "dummy69": 69,
+     "dummy70": 70, "dummy71": 71, "dummy72": 72, "dummy73": 73, "dummy74": 74, "dummy75": 75, "dummy76": 76, "dummy77": 77, "dummy78": 78, "dummy79": 79,
+     "dummy80": 80, "dummy81": 81, "dummy82": 82, "dummy83": 83, "dummy84": 84, "dummy85": 85, "dummy86": 86, "dummy87": 87, "dummy88": 88, "dummy89": 89,
+     "dummy90": 90, "dummy91": 91, "dummy92": 92, "dummy93": 93, "dummy94": 94, "dummy95": 95, "dummy96": 96, "dummy97": 97, "dummy98": 98, "dummy99": 99,
+     "dummy100": 100, "dummy101": 101, "dummy102": 102, "dummy103": 103, "dummy104": 104, "dummy105": 105, "dummy106": 106, "dummy107": 107, "dummy108": 108, "dummy109": 109,
+     "dummy110": 110, "dummy111": 111, "dummy112": 112, "dummy113": 113, "dummy114": 114, "dummy115": 115, "dummy116": 116, "dummy117": 117, "dummy118": 118, "dummy119": 119,
+     "dummy120": 120, "dummy121": 121, "dummy122": 122, "dummy123": 123, "dummy124": 124, "dummy125": 125, "dummy126": 126, "dummy127": 127, "dummy128": 128, "dummy129": 129,
+     "dummy130": 130, "dummy131": 131, "dummy132": 132, "dummy133": 133, "dummy134": 134, "dummy135": 135, "dummy136": 136, "dummy137": 137, "dummy138": 138, "dummy139": 139,
+     "dummy140": 140, "dummy141": 141, "dummy142": 142, "dummy143": 143, "dummy144": 144, "dummy145": 145, "dummy146": 146, "dummy147": 147, "dummy148": 148, "dummy149": 149,
+     "dummy150": 150, "dummy151": 151, "dummy152": 152, "dummy153": 153, "dummy154": 154, "dummy155": 155, "dummy156": 156, "dummy157": 157, "dummy158": 158, "dummy159": 159,
+     "dummy160": 160, "dummy161": 161, "dummy162": 162, "dummy163": 163, "dummy164": 164, "dummy165": 165, "dummy166": 166, "dummy167": 167, "dummy168": 168, "dummy169": 169,
+     "dummy170": 170, "dummy171": 171, "dummy172": 172, "dummy173": 173, "dummy174": 174, "dummy175": 175, "dummy176": 176, "dummy177": 177, "dummy178": 178, "dummy179": 179,
+     "dummy180": 180, "dummy181": 181, "dummy182": 182, "dummy183": 183, "dummy184": 184, "dummy185": 185, "dummy186": 186, "dummy187": 187, "dummy188": 188, "dummy189": 189,
+     "dummy190": 190, "dummy191": 191, "dummy192": 192, "dummy193": 193, "dummy194": 194, "dummy195": 195, "dummy196": 196, "dummy197": 197, "dummy198": 198, "dummy199": 199,
+     "dummy200": 200, "dummy201": 201, "dummy202": 202, "dummy203": 203, "dummy204": 204, "dummy205": 205, "dummy206": 206, "dummy207": 207, "dummy208": 208, "dummy209": 209,
+     "dummy210": 210, "dummy211": 211, "dummy212": 212, "dummy213": 213, "dummy214": 214, "dummy215": 215, "dummy216": 216, "dummy217": 217, "dummy218": 218, "dummy219": 219,
+     "dummy220": 220, "dummy221": 221, "dummy222": 222, "dummy223": 223, "dummy224": 224
+   },
+   "layer_norm_eps": 1e-06,
+   "mlp_ratios": [4, 4, 4, 4],
+   "model_type": "segformer",
+   "num_attention_heads": [1, 2, 5, 8],
+   "num_channels": 3,
+   "num_encoder_blocks": 4,
+   "patch_sizes": [7, 3, 3, 3],
+   "reshape_last_stage": true,
+   "semantic_loss_ignore_index": 255,
+   "sr_ratios": [8, 4, 2, 1],
+   "strides": [4, 2, 2, 2],
+   "torch_dtype": "float32",
+   "transformers_version": "4.35.2"
+ }
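
For reference, the architecture this config describes can be rebuilt offline with `SegformerConfig`. The sketch below copies values from the fields above (`num_labels=225` is implied by the 225-entry label maps) and yields randomly initialized weights:

```python
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# Values mirror the config fields in this commit (MiT-b0 backbone).
config = SegformerConfig(
    num_channels=3,
    num_encoder_blocks=4,
    depths=[2, 2, 2, 2],
    sr_ratios=[8, 4, 2, 1],
    hidden_sizes=[32, 64, 160, 256],
    patch_sizes=[7, 3, 3, 3],
    strides=[4, 2, 2, 2],
    num_attention_heads=[1, 2, 5, 8],
    mlp_ratios=[4, 4, 4, 4],
    decoder_hidden_size=256,
    num_labels=225,
)
model = SegformerForSemanticSegmentation(config)
# Roughly 3.8M parameters, consistent with the ~15 MB float32
# model.safetensors file below (15114044 bytes / 4 bytes per param).
print(sum(p.numel() for p in model.parameters()))
```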
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:79d22707f5455b5e4b5fe20540a36565c535f5659748e1b41899aebfa80bb995
+ size 15114044
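
This file, like the run logs and `training_args.bin` below, is stored as a Git LFS pointer rather than as the binary itself: the repository tracks only the pointer spec version, the SHA-256 `oid` of the payload, and its `size` in bytes, while the actual content lives in LFS storage and is fetched on checkout.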
runs/Nov30_16-09-03_c609a227aa0c/events.out.tfevents.1701360544.c609a227aa0c.138.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2cbe6b3722f52c819bebb91db8d3c387b24252836443654455e9e1a21d0bec8f
+ size 5248
runs/Nov30_16-19-57_c609a227aa0c/events.out.tfevents.1701361198.c609a227aa0c.138.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:24c03fe208feaab77c6056c74bc164a512c311e7dd111686d68fe53a48ba4f94
+ size 17142
runs/Nov30_16-37-56_c609a227aa0c/events.out.tfevents.1701362277.c609a227aa0c.138.2 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:853785757d25a74631ebc65cc27323067c5d259eeb233c208257dc0a517f3ff8
+ size 14062
runs/Nov30_16-38-05_c609a227aa0c/events.out.tfevents.1701362286.c609a227aa0c.138.3 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:89351f82d2db007ea36821c1c10eaf4a212f67868676aa5b0942edc36b9656cd
+ size 14214
runs/Nov30_16-41-38_c609a227aa0c/events.out.tfevents.1701362499.c609a227aa0c.138.4 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:82fd5e76cf4a952676a8eb4fb05f0c7e6e67a5522b644f7111526c84aa2198f4
+ size 37848
runs/Nov30_19-55-59_c609a227aa0c/events.out.tfevents.1701374160.c609a227aa0c.138.5 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:561888fc879454f01ab8b87141344cefb2acf2593ceb726784965a29345e3f16
+ size 37848
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:49e9f8cb46f841b5814957b2f17b91933265413b1b89474a3a76bb7dc010f57f
+ size 4536
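
`training_args.bin` is a pickled `TrainingArguments` object. One way to inspect it locally is sketched below (the path is an assumption, and unpickling executes arbitrary code, so only load files you trust):

```python
import torch

# Loads the pickled transformers.TrainingArguments saved by Trainer.
args = torch.load("training_args.bin", weights_only=False)
print(args.learning_rate, args.num_train_epochs, args.lr_scheduler_type)
```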