# finetune-instance-segmentation-ade20k-mini-mask2former_3
This model is a fine-tuned version of [facebook/mask2former-swin-tiny-coco-instance](https://huggingface.co/facebook/mask2former-swin-tiny-coco-instance) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 21.8296
- mAP: 0.6941
- mAP@50: 0.8672
- mAP@75: 0.7324
- mAP (small): 0.1837
- mAP (medium): 0.533
- mAP (large): 0.8476
- mAR@1: 0.5536
- mAR@10: 0.7509
- mAR@100: 0.7971
- mAR (small): 0.4162
- mAR (medium): 0.7363
- mAR (large): 0.9159
- mAP (background): 0.9263
- mAR@100 (background): 0.9435
- mAP (building): 0.462
- mAR@100 (building): 0.6506
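The thresholded metrics above score a predicted instance as a true positive only when its mask overlaps a ground-truth mask with IoU above the threshold (0.5 for the @50 metrics, 0.75 for @75). A minimal, framework-free sketch of mask IoU with illustrative pixel coordinates:

```python
def mask_iou(pred, gt):
    """Intersection-over-union of two binary masks given as sets of pixel coordinates."""
    pred, gt = set(pred), set(gt)
    union = len(pred | gt)
    return len(pred & gt) / union if union else 0.0

# A prediction covering 3 of 4 ground-truth pixels plus 1 spurious pixel:
pred = [(0, 0), (0, 1), (1, 0), (2, 2)]
gt   = [(0, 0), (0, 1), (1, 0), (1, 1)]
iou = mask_iou(pred, gt)        # 3 / 5 = 0.6
print(iou >= 0.5, iou >= 0.75)  # a match at IoU 0.5, but not at 0.75
```

The plain mAP figure averages precision over IoU thresholds from 0.5 to 0.95, which is why it sits between the @50 and @75 values.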
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 64
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 30.0
- mixed_precision_training: Native AMP
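With `train_batch_size: 64` and `gradient_accumulation_steps: 2`, the optimizer steps once per two forward/backward passes, giving the listed `total_train_batch_size` of 128. A framework-agnostic sketch (toy quadratic loss, illustrative numbers only) of why accumulated micro-batch gradients match the full-batch gradient:

```python
train_batch_size = 64
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 128

# For a loss averaged over the batch, averaging the per-micro-batch mean
# gradients over the accumulation steps equals the full-batch mean gradient.
# Toy loss L(w) = mean((w - x_i)^2), so dL/dw = mean(2 * (w - x_i)):
def mean_grad(w, xs):
    return sum(2 * (w - x) for x in xs) / len(xs)

w = 0.5
batch = [float(i) for i in range(128)]
micro1, micro2 = batch[:64], batch[64:]

full = mean_grad(w, batch)
accumulated = (mean_grad(w, micro1) + mean_grad(w, micro2)) / gradient_accumulation_steps
print(abs(full - accumulated) < 1e-9)  # the two gradients agree
```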
### Training results
Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP (small) | mAP (medium) | mAP (large) | mAR@1 | mAR@10 | mAR@100 | mAR (small) | mAR (medium) | mAR (large) | mAP (background) | mAR@100 (background) | mAP (building) | mAR@100 (building) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
113.7971 | 1.0 | 18 | 54.5754 | 0.0104 | 0.0275 | 0.0056 | 0.0007 | 0.0092 | 0.0318 | 0.0158 | 0.1576 | 0.3476 | 0.0456 | 0.1433 | 0.3825 | 0.0169 | 0.5786 | 0.004 | 0.1165 |
46.1207 | 2.0 | 36 | 41.7255 | 0.1211 | 0.2383 | 0.1079 | 0.0297 | 0.1815 | 0.3339 | 0.1421 | 0.4984 | 0.6046 | 0.2304 | 0.5813 | 0.7349 | 0.088 | 0.7217 | 0.1542 | 0.4874 |
37.5319 | 3.0 | 54 | 35.0953 | 0.3438 | 0.5097 | 0.3674 | 0.0693 | 0.3147 | 0.5321 | 0.4204 | 0.6597 | 0.7112 | 0.2853 | 0.62 | 0.8425 | 0.4114 | 0.8886 | 0.2762 | 0.5339 |
32.2408 | 4.0 | 72 | 31.1007 | 0.4725 | 0.6521 | 0.5066 | 0.0955 | 0.3797 | 0.6318 | 0.5057 | 0.6913 | 0.7412 | 0.3187 | 0.6466 | 0.8723 | 0.6119 | 0.9191 | 0.3331 | 0.5634 |
29.0339 | 5.0 | 90 | 28.8429 | 0.5993 | 0.7915 | 0.6487 | 0.1077 | 0.4116 | 0.7603 | 0.5213 | 0.7062 | 0.7536 | 0.3345 | 0.6634 | 0.8849 | 0.8423 | 0.9273 | 0.3563 | 0.5798 |
27.0289 | 6.0 | 108 | 27.2669 | 0.6292 | 0.8239 | 0.6783 | 0.1232 | 0.4349 | 0.7877 | 0.5305 | 0.7117 | 0.7585 | 0.3505 | 0.6724 | 0.889 | 0.883 | 0.9253 | 0.3753 | 0.5916 |
25.7301 | 7.0 | 126 | 26.2551 | 0.6392 | 0.8322 | 0.6873 | 0.1325 | 0.4502 | 0.7974 | 0.532 | 0.7143 | 0.7647 | 0.3629 | 0.69 | 0.8908 | 0.8905 | 0.9248 | 0.3879 | 0.6047 |
24.7686 | 8.0 | 144 | 25.5236 | 0.6473 | 0.8391 | 0.6943 | 0.1394 | 0.4632 | 0.8027 | 0.5368 | 0.718 | 0.7679 | 0.3667 | 0.6967 | 0.8943 | 0.8956 | 0.9257 | 0.399 | 0.6102 |
24.1245 | 9.0 | 162 | 24.9991 | 0.6529 | 0.8439 | 0.6976 | 0.1428 | 0.4749 | 0.8062 | 0.539 | 0.7193 | 0.7697 | 0.3736 | 0.7027 | 0.895 | 0.8974 | 0.9234 | 0.4084 | 0.6161 |
23.5334 | 10.0 | 180 | 24.7225 | 0.6582 | 0.8458 | 0.7004 | 0.1506 | 0.4812 | 0.8093 | 0.5416 | 0.724 | 0.7744 | 0.3799 | 0.7082 | 0.8968 | 0.9016 | 0.9282 | 0.4149 | 0.6206 |
23.0677 | 11.0 | 198 | 24.3978 | 0.6632 | 0.849 | 0.7049 | 0.1521 | 0.4852 | 0.8161 | 0.5432 | 0.7258 | 0.7761 | 0.3796 | 0.7115 | 0.9006 | 0.9071 | 0.9289 | 0.4193 | 0.6233 |
22.597 | 12.0 | 216 | 24.0400 | 0.6675 | 0.8523 | 0.707 | 0.157 | 0.4926 | 0.8196 | 0.5438 | 0.7295 | 0.7784 | 0.3842 | 0.7136 | 0.9032 | 0.9094 | 0.9301 | 0.4255 | 0.6267 |
22.2306 | 13.0 | 234 | 23.8523 | 0.6705 | 0.8544 | 0.7078 | 0.1584 | 0.4932 | 0.822 | 0.5466 | 0.7306 | 0.7817 | 0.3844 | 0.7202 | 0.9055 | 0.9134 | 0.9335 | 0.4276 | 0.6299 |
21.8893 | 14.0 | 252 | 23.6264 | 0.6735 | 0.8559 | 0.7123 | 0.1607 | 0.5 | 0.8253 | 0.548 | 0.7344 | 0.7829 | 0.3922 | 0.7166 | 0.9055 | 0.9144 | 0.9349 | 0.4327 | 0.6308 |
21.6071 | 15.0 | 270 | 23.4112 | 0.676 | 0.8575 | 0.7141 | 0.1625 | 0.5027 | 0.8278 | 0.5492 | 0.7355 | 0.7836 | 0.3919 | 0.7178 | 0.9063 | 0.9167 | 0.9358 | 0.4353 | 0.6314 |
21.2754 | 16.0 | 288 | 23.1654 | 0.6782 | 0.8573 | 0.7177 | 0.1649 | 0.5109 | 0.8295 | 0.5478 | 0.7382 | 0.7869 | 0.3969 | 0.7282 | 0.9077 | 0.9166 | 0.9353 | 0.4398 | 0.6385 |
21.0393 | 17.0 | 306 | 23.0104 | 0.6796 | 0.8592 | 0.7186 | 0.1682 | 0.5086 | 0.8329 | 0.5484 | 0.7395 | 0.7868 | 0.396 | 0.7247 | 0.9095 | 0.9184 | 0.9365 | 0.4408 | 0.6371 |
20.8053 | 18.0 | 324 | 22.8456 | 0.683 | 0.8598 | 0.7218 | 0.1705 | 0.5171 | 0.8342 | 0.5503 | 0.7417 | 0.7895 | 0.4014 | 0.7308 | 0.9109 | 0.9195 | 0.9369 | 0.4465 | 0.6422 |
20.5247 | 19.0 | 342 | 22.7148 | 0.6829 | 0.8614 | 0.7207 | 0.1696 | 0.5175 | 0.8339 | 0.5499 | 0.7417 | 0.7895 | 0.4034 | 0.729 | 0.9102 | 0.9201 | 0.9373 | 0.4457 | 0.6417 |
20.4118 | 20.0 | 360 | 22.5014 | 0.6844 | 0.8605 | 0.7211 | 0.1705 | 0.5185 | 0.8375 | 0.5516 | 0.7432 | 0.7909 | 0.4025 | 0.7317 | 0.9109 | 0.9212 | 0.9394 | 0.4476 | 0.6425 |
20.1455 | 21.0 | 378 | 22.4692 | 0.6863 | 0.864 | 0.7234 | 0.1715 | 0.5212 | 0.8375 | 0.5513 | 0.7445 | 0.7924 | 0.4055 | 0.7332 | 0.9116 | 0.9226 | 0.9406 | 0.4501 | 0.6443 |
19.904 | 22.0 | 396 | 22.4235 | 0.6877 | 0.8654 | 0.7244 | 0.1737 | 0.5248 | 0.8411 | 0.5526 | 0.7432 | 0.792 | 0.4058 | 0.7314 | 0.9133 | 0.9227 | 0.9396 | 0.4526 | 0.6445 |
19.6888 | 23.0 | 414 | 22.3299 | 0.6902 | 0.8652 | 0.7264 | 0.1769 | 0.5283 | 0.8421 | 0.5531 | 0.7446 | 0.7929 | 0.409 | 0.7326 | 0.9143 | 0.924 | 0.9392 | 0.4563 | 0.6466 |
19.4733 | 24.0 | 432 | 22.2339 | 0.69 | 0.8649 | 0.7267 | 0.1763 | 0.5277 | 0.8426 | 0.5527 | 0.7463 | 0.7937 | 0.4107 | 0.7318 | 0.9142 | 0.9243 | 0.9408 | 0.4558 | 0.6465 |
19.3223 | 25.0 | 450 | 22.1182 | 0.6902 | 0.8656 | 0.7275 | 0.1793 | 0.5286 | 0.843 | 0.5525 | 0.7469 | 0.7939 | 0.4105 | 0.7342 | 0.9159 | 0.9234 | 0.9394 | 0.457 | 0.6484 |
19.1744 | 26.0 | 468 | 22.0561 | 0.6915 | 0.8678 | 0.7304 | 0.1798 | 0.5284 | 0.8448 | 0.5524 | 0.7457 | 0.7936 | 0.4125 | 0.7326 | 0.9133 | 0.9238 | 0.9398 | 0.4592 | 0.6474 |
19.002 | 27.0 | 486 | 22.0421 | 0.6932 | 0.8676 | 0.7294 | 0.1811 | 0.5334 | 0.8447 | 0.5522 | 0.7485 | 0.7966 | 0.4156 | 0.7399 | 0.9151 | 0.9251 | 0.941 | 0.4612 | 0.6522 |
18.8741 | 28.0 | 504 | 21.9818 | 0.6939 | 0.8683 | 0.7334 | 0.1836 | 0.5328 | 0.847 | 0.553 | 0.7494 | 0.7968 | 0.4165 | 0.7361 | 0.9173 | 0.9263 | 0.9421 | 0.4615 | 0.6514 |
19.2285 | 28.3429 | 510 | 21.8296 | 0.6941 | 0.8672 | 0.7324 | 0.1837 | 0.533 | 0.8476 | 0.5536 | 0.7509 | 0.7971 | 0.4162 | 0.7363 | 0.9159 | 0.9263 | 0.9435 | 0.462 | 0.6506 |
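The checkpoint can be loaded with the standard Transformers Mask2Former classes. The sketch below is untested: it assumes the Hub id `Golder/finetune-instance-segmentation-ade20k-mini-mask2former_3`, and `keep_confident` is a hypothetical helper (not part of the model) for filtering predicted instances by score. The heavy imports happen inside `run_inference` so the helper itself has no dependencies.

```python
CHECKPOINT = "Golder/finetune-instance-segmentation-ade20k-mini-mask2former_3"

def keep_confident(segments_info, threshold=0.5):
    """Hypothetical helper: keep predicted instances whose score clears the threshold."""
    return [s for s in segments_info if s["score"] >= threshold]

def run_inference(image):
    """Run instance segmentation on a PIL image (sketch; not executed here)."""
    from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

    processor = AutoImageProcessor.from_pretrained(CHECKPOINT)
    model = Mask2FormerForUniversalSegmentation.from_pretrained(CHECKPOINT)
    inputs = processor(images=image, return_tensors="pt")
    outputs = model(**inputs)
    result = processor.post_process_instance_segmentation(
        outputs, target_sizes=[image.size[::-1]]  # (height, width)
    )[0]
    # result["segmentation"]: per-pixel instance ids; result["segments_info"]: metadata
    return result["segmentation"], keep_confident(result["segments_info"])

# The helper applied to dummy predictions:
dummy = [{"id": 1, "label_id": 0, "score": 0.9}, {"id": 2, "label_id": 1, "score": 0.3}]
print(keep_confident(dummy))  # only the 0.9-score instance survives
```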
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0