---
license: apache-2.0
base_model: PekingU/rtdetr_r50vd_coco_o365
tags:
- generated_from_trainer
model-index:
- name: rtdetr-r50-cppe5-finetune-use_focal-False
  results: []
---

# rtdetr-r50-cppe5-finetune-use_focal-False

This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on the CPPE-5 dataset.
It achieves the following results on the evaluation set:
- Loss: 10.2265
- Map: 0.3751
- Map 50: 0.5356
- Map 75: 0.4176
- Map Small: 0.5517
- Map Medium: 0.3568
- Map Large: 0.5283
- Mar 1: 0.329
- Mar 10: 0.6005
- Mar 100: 0.6462
- Mar Small: 0.5882
- Mar Medium: 0.5601
- Mar Large: 0.7161
- Map Coverall: 0.3237
- Mar 100 Coverall: 0.8564
- Map Face Shield: 0.4291
- Mar 100 Face Shield: 0.8467
- Map Gloves: 0.6116
- Mar 100 Gloves: 0.7638
- Map Goggles: 0.5111
- Mar 100 Goggles: 0.7643
- Map Mask: 0.0
- Mar 100 Mask: 0.0

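The Map/Mar entries above follow the COCO detection protocol: `Map` is mean average precision averaged over IoU thresholds 0.50–0.95, `Map 50`/`Map 75` fix the IoU threshold at 0.50 or 0.75, the Small/Medium/Large variants split results by object area, and `Mar 1/10/100` are mean average recall given at most 1, 10, or 100 detections per image. Per-class rows (Coverall, Face Shield, Gloves, Goggles, Mask) report AP and AR@100 for each category. As a minimal sketch, such numbers are commonly computed with the `torchmetrics` library; this is an assumption about the evaluation tooling, not the exact script used for this card:

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# COCO-style mAP/mAR metric; box_format="xyxy" matches post-processed detector output.
metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

# Toy prediction/target pair; in practice these come from post-processed model
# outputs and the ground-truth boxes of the evaluation split.
preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 100.0, 100.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 12.0, 98.0, 98.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["mar_100"])  # mAP, mAP@0.50, mAR@100
```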
## Model description

This is an RT-DETR object detector with a ResNet-50 backbone, fine-tuned from the [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) checkpoint (pretrained on COCO and Objects365) to detect the five CPPE-5 personal protective equipment classes: Coverall, Face Shield, Gloves, Goggles, and Mask.

## Intended uses & limitations

The model is intended for detecting medical personal protective equipment (coveralls, face shields, gloves, goggles, and masks) in images; a usage sketch is shown below. Note one clear limitation visible in the results: AP and AR for the Mask class remain at 0.0 throughout training and in the final evaluation, so this checkpoint effectively fails to detect masks.
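A minimal inference sketch using 🤗 Transformers; the repo id below is a placeholder for wherever this checkpoint is hosted, and the test image URL is only an example input:

```python
import requests
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Placeholder repo id; replace with the actual location of this fine-tuned checkpoint.
checkpoint = "your-username/rtdetr-r50-cppe5-finetune-use_focal-False"
image_processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

# Any RGB image will do; this COCO sample is just an example input.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into scored detections above a confidence threshold.
results = image_processor.post_process_object_detection(
    outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.5
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```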

## Training and evaluation data

The model was fine-tuned and evaluated on the CPPE-5 dataset (inferred from the run name and the per-class metrics), which contains images annotated with bounding boxes for the five personal protective equipment categories listed above.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 30

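For reference, a minimal sketch of how these settings map onto the Trainer's `TrainingArguments`; the listed Adam settings are the Trainer defaults and need no explicit flag, and this is an assumption about the setup rather than the exact training script:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="rtdetr-r50-cppe5-finetune-use_focal-False",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=300,
    num_train_epochs=30,
)
```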
### Training results

| Training Loss | Epoch | Step | Validation Loss | Map    | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1  | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log        | 1.0   | 106  | 48.0056         | 0.0096 | 0.0182 | 0.0078 | 0.0004    | 0.0011     | 0.0141    | 0.0279 | 0.0877 | 0.1286  | 0.0364    | 0.0457     | 0.2074    | 0.0454       | 0.3986           | 0.0008          | 0.0467              | 0.0014     | 0.0836         | 0.0002      | 0.1141          | 0.0      | 0.0          |
| No log        | 2.0   | 212  | 24.1177         | 0.0533 | 0.0967 | 0.0483 | 0.0429    | 0.0161     | 0.0799    | 0.0924 | 0.2249 | 0.2994  | 0.1024    | 0.1912     | 0.4803    | 0.1701       | 0.63             | 0.0057          | 0.3213              | 0.07       | 0.3301         | 0.0207      | 0.2156          | 0.0      | 0.0          |
| No log        | 3.0   | 318  | 19.6837         | 0.0909 | 0.1714 | 0.0842 | 0.1028    | 0.0561     | 0.1558    | 0.168  | 0.3546 | 0.4261  | 0.2402    | 0.3345     | 0.5601    | 0.1759       | 0.6977           | 0.0739          | 0.508               | 0.1338     | 0.5059         | 0.0708      | 0.4187          | 0.0      | 0.0          |
| No log        | 4.0   | 424  | 15.9325         | 0.1186 | 0.2136 | 0.1107 | 0.0801    | 0.0859     | 0.2414    | 0.1884 | 0.3751 | 0.4496  | 0.262     | 0.3839     | 0.5603    | 0.2001       | 0.7092           | 0.0708          | 0.5373              | 0.1727     | 0.5123         | 0.1493      | 0.4891          | 0.0      | 0.0          |
| 46.7509       | 5.0   | 530  | 15.7219         | 0.1652 | 0.2835 | 0.1715 | 0.1035    | 0.13       | 0.2798    | 0.2215 | 0.3943 | 0.4835  | 0.3143    | 0.4257     | 0.58      | 0.2815       | 0.7286           | 0.1155          | 0.616               | 0.2126     | 0.5466         | 0.2163      | 0.5266          | 0.0      | 0.0          |
| 46.7509       | 6.0   | 636  | 15.2971         | 0.1521 | 0.2608 | 0.1585 | 0.1615    | 0.107      | 0.3026    | 0.2048 | 0.3936 | 0.4832  | 0.3281    | 0.4196     | 0.5982    | 0.2162       | 0.7461           | 0.1248          | 0.608               | 0.2377     | 0.5525         | 0.1817      | 0.5094          | 0.0      | 0.0          |
| 46.7509       | 7.0   | 742  | 14.7975         | 0.1739 | 0.3033 | 0.1663 | 0.1774    | 0.1249     | 0.2946    | 0.22   | 0.4152 | 0.4941  | 0.3349    | 0.4265     | 0.6162    | 0.2714       | 0.7359           | 0.1555          | 0.64                | 0.2563     | 0.5648         | 0.1865      | 0.5297          | 0.0      | 0.0          |
| 46.7509       | 8.0   | 848  | 14.0780         | 0.173  | 0.3097 | 0.17   | 0.1574    | 0.1366     | 0.3098    | 0.2273 | 0.4161 | 0.491   | 0.3402    | 0.4332     | 0.6074    | 0.2117       | 0.7401           | 0.1694          | 0.6413              | 0.2664     | 0.5626         | 0.2175      | 0.5109          | 0.0      | 0.0          |
| 46.7509       | 9.0   | 954  | 14.7545         | 0.1838 | 0.3225 | 0.1884 | 0.1876    | 0.1506     | 0.3254    | 0.2366 | 0.4295 | 0.5001  | 0.3676    | 0.4306     | 0.6197    | 0.2092       | 0.7424           | 0.1855          | 0.648               | 0.2837     | 0.563          | 0.2407      | 0.5469          | 0.0      | 0.0          |
| 15.0328       | 10.0  | 1060 | 14.8555         | 0.1901 | 0.3198 | 0.1947 | 0.1801    | 0.1621     | 0.3094    | 0.2376 | 0.4254 | 0.4966  | 0.3667    | 0.4266     | 0.6081    | 0.2333       | 0.7355           | 0.2159          | 0.6267              | 0.274      | 0.5676         | 0.2271      | 0.5531          | 0.0      | 0.0          |
| 15.0328       | 11.0  | 1166 | 14.5398         | 0.2122 | 0.3554 | 0.2155 | 0.1985    | 0.1684     | 0.3698    | 0.2458 | 0.4265 | 0.5005  | 0.3759    | 0.4312     | 0.6075    | 0.3117       | 0.7475           | 0.2296          | 0.6307              | 0.2706     | 0.5667         | 0.2489      | 0.5578          | 0.0      | 0.0          |
| 15.0328       | 12.0  | 1272 | 13.9358         | 0.2154 | 0.3669 | 0.2171 | 0.1878    | 0.1602     | 0.3618    | 0.2421 | 0.4238 | 0.5003  | 0.3679    | 0.4206     | 0.6028    | 0.2657       | 0.7433           | 0.2847          | 0.6507              | 0.2909     | 0.5763         | 0.2358      | 0.5312          | 0.0      | 0.0          |
| 15.0328       | 13.0  | 1378 | 13.4203         | 0.2098 | 0.3538 | 0.2204 | 0.1888    | 0.1669     | 0.344     | 0.25   | 0.4176 | 0.5     | 0.3465    | 0.4387     | 0.6184    | 0.2666       | 0.7558           | 0.2454          | 0.6707              | 0.3017     | 0.5813         | 0.2355      | 0.4922          | 0.0      | 0.0          |
| 15.0328       | 14.0  | 1484 | 13.6884         | 0.1899 | 0.3258 | 0.1903 | 0.1759    | 0.169      | 0.3004    | 0.2455 | 0.4249 | 0.4988  | 0.3645    | 0.4277     | 0.6163    | 0.2431       | 0.7456           | 0.25            | 0.656               | 0.2834     | 0.5877         | 0.1732      | 0.5047          | 0.0      | 0.0          |
| 13.3953       | 15.0  | 1590 | 13.1320         | 0.1898 | 0.3258 | 0.192  | 0.1843    | 0.1431     | 0.3191    | 0.2492 | 0.4182 | 0.4935  | 0.362     | 0.4115     | 0.6128    | 0.2222       | 0.7401           | 0.2256          | 0.6347              | 0.2892     | 0.5694         | 0.2119      | 0.5234          | 0.0      | 0.0          |
| 13.3953       | 16.0  | 1696 | 12.9858         | 0.2019 | 0.3335 | 0.2047 | 0.1801    | 0.1637     | 0.3552    | 0.2554 | 0.4233 | 0.5005  | 0.3673    | 0.4268     | 0.6094    | 0.2745       | 0.7498           | 0.2364          | 0.644               | 0.2863     | 0.5712         | 0.2123      | 0.5375          | 0.0      | 0.0          |
| 13.3953       | 17.0  | 1802 | 12.9965         | 0.2013 | 0.3422 | 0.1992 | 0.1809    | 0.1569     | 0.3458    | 0.248  | 0.4273 | 0.4914  | 0.3632    | 0.4103     | 0.6139    | 0.2725       | 0.7392           | 0.227           | 0.628               | 0.283      | 0.5712         | 0.2238      | 0.5188          | 0.0      | 0.0          |
| 13.3953       | 18.0  | 1908 | 12.9245         | 0.1948 | 0.3346 | 0.1923 | 0.186     | 0.1505     | 0.3242    | 0.2386 | 0.4303 | 0.5049  | 0.393     | 0.4192     | 0.6206    | 0.2743       | 0.7392           | 0.219           | 0.644               | 0.2984     | 0.574          | 0.1824      | 0.5672          | 0.0      | 0.0          |
| 12.6173       | 19.0  | 2014 | 12.9508         | 0.209  | 0.3508 | 0.2072 | 0.1722    | 0.15       | 0.3759    | 0.2444 | 0.4307 | 0.5022  | 0.3574    | 0.4338     | 0.6149    | 0.2605       | 0.7456           | 0.2698          | 0.6533              | 0.2983     | 0.584          | 0.2166      | 0.5281          | 0.0      | 0.0          |
| 12.6173       | 20.0  | 2120 | 13.3318         | 0.2137 | 0.3577 | 0.223  | 0.1887    | 0.168      | 0.3762    | 0.2458 | 0.4233 | 0.4925  | 0.3564    | 0.4209     | 0.6174    | 0.2679       | 0.7382           | 0.2664          | 0.636               | 0.2961     | 0.5662         | 0.2382      | 0.5219          | 0.0      | 0.0          |
| 12.6173       | 21.0  | 2226 | 13.0245         | 0.2147 | 0.3564 | 0.2197 | 0.1747    | 0.1795     | 0.3656    | 0.2553 | 0.4266 | 0.4976  | 0.3564    | 0.4265     | 0.6323    | 0.2778       | 0.7387           | 0.2786          | 0.6533              | 0.299      | 0.5758         | 0.2182      | 0.5203          | 0.0      | 0.0          |
| 12.6173       | 22.0  | 2332 | 12.9212         | 0.2161 | 0.3697 | 0.2196 | 0.1798    | 0.1758     | 0.3509    | 0.2553 | 0.4358 | 0.5001  | 0.3676    | 0.4165     | 0.6155    | 0.2721       | 0.7355           | 0.2841          | 0.648               | 0.3008     | 0.5813         | 0.2235      | 0.5359          | 0.0      | 0.0          |
| 12.6173       | 23.0  | 2438 | 12.9598         | 0.2229 | 0.3751 | 0.2366 | 0.1823    | 0.177      | 0.3596    | 0.2533 | 0.4256 | 0.4976  | 0.3697    | 0.4193     | 0.6026    | 0.2901       | 0.7401           | 0.3074          | 0.6507              | 0.2954     | 0.5831         | 0.2218      | 0.5141          | 0.0      | 0.0          |
| 12.1016       | 24.0  | 2544 | 12.9207         | 0.2141 | 0.3637 | 0.2115 | 0.1954    | 0.1776     | 0.3447    | 0.2464 | 0.4278 | 0.4968  | 0.3665    | 0.4224     | 0.607     | 0.2536       | 0.7378           | 0.3149          | 0.6373              | 0.2955     | 0.5776         | 0.2064      | 0.5312          | 0.0      | 0.0          |
| 12.1016       | 25.0  | 2650 | 12.8912         | 0.2215 | 0.3703 | 0.2287 | 0.197     | 0.175      | 0.3619    | 0.2538 | 0.4312 | 0.4979  | 0.3657    | 0.4263     | 0.6035    | 0.2812       | 0.7465           | 0.3006          | 0.628               | 0.3082     | 0.5913         | 0.2173      | 0.5234          | 0.0      | 0.0          |
| 12.1016       | 26.0  | 2756 | 12.7256         | 0.2198 | 0.3724 | 0.2245 | 0.1926    | 0.1776     | 0.3522    | 0.2465 | 0.4341 | 0.5016  | 0.3742    | 0.4312     | 0.6073    | 0.2757       | 0.7479           | 0.2864          | 0.6333              | 0.3018     | 0.5831         | 0.2351      | 0.5437          | 0.0      | 0.0          |
| 12.1016       | 27.0  | 2862 | 12.7695         | 0.2259 | 0.3771 | 0.2409 | 0.1941    | 0.1854     | 0.3571    | 0.249  | 0.433  | 0.5027  | 0.3773    | 0.4412     | 0.591     | 0.2804       | 0.7447           | 0.3012          | 0.6333              | 0.3023     | 0.5872         | 0.2457      | 0.5484          | 0.0      | 0.0          |
| 12.1016       | 28.0  | 2968 | 12.8203         | 0.2231 | 0.3781 | 0.2333 | 0.1961    | 0.1837     | 0.3527    | 0.2492 | 0.4328 | 0.5001  | 0.3718    | 0.4331     | 0.607     | 0.2634       | 0.7456           | 0.3069          | 0.6347              | 0.3026     | 0.5872         | 0.2427      | 0.5328          | 0.0      | 0.0          |
| 11.5997       | 29.0  | 3074 | 12.7159         | 0.2205 | 0.3758 | 0.2306 | 0.194     | 0.177      | 0.3562    | 0.251  | 0.4322 | 0.5029  | 0.3882    | 0.4349     | 0.6057    | 0.258        | 0.7465           | 0.3073          | 0.6413              | 0.3037     | 0.5826         | 0.2336      | 0.5437          | 0.0      | 0.0          |
| 11.5997       | 30.0  | 3180 | 12.7243         | 0.2205 | 0.3735 | 0.2311 | 0.1938    | 0.1773     | 0.3609    | 0.2515 | 0.4326 | 0.5016  | 0.3869    | 0.4314     | 0.5982    | 0.2638       | 0.7438           | 0.2967          | 0.6347              | 0.3029     | 0.584          | 0.2393      | 0.5453          | 0.0      | 0.0          |


### Framework versions

- Transformers 4.45.0.dev0
- Pytorch 2.4.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1