# mask2former-finetuned-ER-Mito-LD3
This model is a fine-tuned version of [facebook/mask2former-swin-base-IN21k-ade-semantic](https://huggingface.co/facebook/mask2former-swin-base-IN21k-ade-semantic) on the Dnq2025/Mask2former_Pretrain dataset. It achieves the following results on the evaluation set:
- Loss: 39.9236
- Dummy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
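As a minimal inference sketch (the Hub repo id below is an assumption based on the card title and the dataset namespace; substitute the actual checkpoint path), the model can be loaded for semantic segmentation with the standard `transformers` Mask2Former classes:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

# Assumed repo id -- replace with the actual Hub path of this checkpoint.
checkpoint = "Dnq2025/mask2former-finetuned-ER-Mito-LD3"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = Mask2FormerForUniversalSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example_micrograph.png").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Resolve per-pixel class ids; target_sizes expects (height, width) pairs.
semantic_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]
```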
## Training and evaluation data
More information needed
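The dataset is identified only by its Hub id. A hedged sketch for inspecting it with the `datasets` library (split names and features are not documented in this card):

```python
from datasets import load_dataset

# Dataset id taken from this card; available splits are not documented here.
dataset = load_dataset("Dnq2025/Mask2former_Pretrain")
print(dataset)  # inspect splits, features, and sizes before training
```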
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 4
- seed: 1337
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 12900
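These values map directly onto `transformers.TrainingArguments`; a minimal reproduction sketch, with `output_dir` as a placeholder (the original training script is not part of this card):

```python
from transformers import TrainingArguments

# Hyperparameter values copied from the list above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="mask2former-finetuned-ER-Mito-LD3",
    learning_rate=4e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=1337,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="polynomial",
    max_steps=12900,
)
```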
### Training results
| Training Loss | Epoch | Step | Validation Loss | Dummy |
|:-------------:|:-----:|:----:|:---------------:|:-----:|
| 57.794 | 1.0 | 129 | 47.5030 | 1.0 |
| 45.635 | 2.0 | 258 | 42.4622 | 1.0 |
| 43.6742 | 3.0 | 387 | 40.1383 | 1.0 |
| 37.5286 | 4.0 | 516 | 41.1964 | 1.0 |
| 33.7618 | 5.0 | 645 | 34.4374 | 1.0 |
| 31.5899 | 6.0 | 774 | 39.8242 | 1.0 |
| 29.0727 | 7.0 | 903 | 33.3223 | 1.0 |
| 27.8483 | 8.0 | 1032 | 30.9625 | 1.0 |
| 26.0904 | 9.0 | 1161 | 31.7084 | 1.0 |
| 26.1043 | 10.0 | 1290 | 31.8088 | 1.0 |
| 24.3038 | 11.0 | 1419 | 30.3361 | 1.0 |
| 23.6493 | 12.0 | 1548 | 30.2030 | 1.0 |
| 23.9146 | 13.0 | 1677 | 31.0806 | 1.0 |
| 21.9133 | 14.0 | 1806 | 31.3974 | 1.0 |
| 22.3071 | 15.0 | 1935 | 32.0925 | 1.0 |
| 21.0819 | 16.0 | 2064 | 29.9367 | 1.0 |
| 21.0089 | 17.0 | 2193 | 30.0420 | 1.0 |
| 20.9169 | 18.0 | 2322 | 29.2938 | 1.0 |
| 19.7935 | 19.0 | 2451 | 31.3945 | 1.0 |
| 19.8749 | 20.0 | 2580 | 29.8457 | 1.0 |
| 19.2973 | 21.0 | 2709 | 29.0713 | 1.0 |
| 18.5436 | 22.0 | 2838 | 29.0846 | 1.0 |
| 18.5996 | 23.0 | 2967 | 29.8810 | 1.0 |
| 19.1228 | 24.0 | 3096 | 29.3016 | 1.0 |
| 18.0519 | 25.0 | 3225 | 30.7155 | 1.0 |
| 17.7073 | 26.0 | 3354 | 28.7168 | 1.0 |
| 17.5055 | 27.0 | 3483 | 28.9899 | 1.0 |
| 17.4854 | 28.0 | 3612 | 30.1944 | 1.0 |
| 17.0048 | 29.0 | 3741 | 29.2829 | 1.0 |
| 16.8731 | 30.0 | 3870 | 30.1208 | 1.0 |
| 16.683 | 31.0 | 3999 | 30.7583 | 1.0 |
| 16.6109 | 32.0 | 4128 | 30.6232 | 1.0 |
| 15.8261 | 33.0 | 4257 | 29.4162 | 1.0 |
| 16.9002 | 34.0 | 4386 | 30.4388 | 1.0 |
| 16.3081 | 35.0 | 4515 | 29.9756 | 1.0 |
| 15.4745 | 36.0 | 4644 | 28.8214 | 1.0 |
| 15.938 | 37.0 | 4773 | 29.1001 | 1.0 |
| 15.9947 | 38.0 | 4902 | 31.0533 | 1.0 |
| 15.2328 | 39.0 | 5031 | 31.6211 | 1.0 |
| 15.202 | 40.0 | 5160 | 33.1383 | 1.0 |
| 15.0583 | 41.0 | 5289 | 31.4089 | 1.0 |
| 14.573 | 42.0 | 5418 | 31.5681 | 1.0 |
| 14.7401 | 43.0 | 5547 | 30.5548 | 1.0 |
| 14.6052 | 44.0 | 5676 | 31.3953 | 1.0 |
| 14.1299 | 45.0 | 5805 | 30.8153 | 1.0 |
| 13.6851 | 46.0 | 5934 | 30.9693 | 1.0 |
| 14.6677 | 47.0 | 6063 | 31.9361 | 1.0 |
| 13.6493 | 48.0 | 6192 | 34.3328 | 1.0 |
| 14.166 | 49.0 | 6321 | 32.6231 | 1.0 |
| 13.7388 | 50.0 | 6450 | 33.1736 | 1.0 |
| 13.0849 | 51.0 | 6579 | 34.9522 | 1.0 |
| 13.2502 | 52.0 | 6708 | 35.7990 | 1.0 |
| 13.5116 | 53.0 | 6837 | 31.5737 | 1.0 |
| 12.6993 | 54.0 | 6966 | 33.2650 | 1.0 |
| 13.3602 | 55.0 | 7095 | 34.8914 | 1.0 |
| 12.9585 | 56.0 | 7224 | 35.9862 | 1.0 |
| 12.7434 | 57.0 | 7353 | 34.9106 | 1.0 |
| 12.7299 | 58.0 | 7482 | 34.0106 | 1.0 |
| 12.717 | 59.0 | 7611 | 36.3588 | 1.0 |
| 12.0563 | 60.0 | 7740 | 35.0923 | 1.0 |
| 13.012 | 61.0 | 7869 | 38.7323 | 1.0 |
| 12.2878 | 62.0 | 7998 | 34.9967 | 1.0 |
| 12.2794 | 63.0 | 8127 | 37.5577 | 1.0 |
| 12.4147 | 64.0 | 8256 | 37.2733 | 1.0 |
| 12.0032 | 65.0 | 8385 | 35.3015 | 1.0 |
| 12.2793 | 66.0 | 8514 | 35.2806 | 1.0 |
| 12.2309 | 67.0 | 8643 | 36.2488 | 1.0 |
| 11.7082 | 68.0 | 8772 | 35.6687 | 1.0 |
| 11.8694 | 69.0 | 8901 | 36.0470 | 1.0 |
| 11.782 | 70.0 | 9030 | 35.4055 | 1.0 |
| 11.6254 | 71.0 | 9159 | 36.7066 | 1.0 |
| 11.5873 | 72.0 | 9288 | 36.1084 | 1.0 |
| 11.6251 | 73.0 | 9417 | 38.2932 | 1.0 |
| 11.4589 | 74.0 | 9546 | 36.5570 | 1.0 |
| 11.7378 | 75.0 | 9675 | 35.9887 | 1.0 |
| 11.4933 | 76.0 | 9804 | 36.4713 | 1.0 |
| 11.2566 | 77.0 | 9933 | 36.9622 | 1.0 |
| 11.25 | 78.0 | 10062 | 37.1016 | 1.0 |
| 11.2962 | 79.0 | 10191 | 37.8711 | 1.0 |
| 11.0868 | 80.0 | 10320 | 38.5714 | 1.0 |
| 11.2786 | 81.0 | 10449 | 38.1493 | 1.0 |
| 11.1528 | 82.0 | 10578 | 39.0100 | 1.0 |
| 11.089 | 83.0 | 10707 | 38.5474 | 1.0 |
| 10.954 | 84.0 | 10836 | 38.9405 | 1.0 |
| 11.0157 | 85.0 | 10965 | 39.3872 | 1.0 |
| 10.9849 | 86.0 | 11094 | 39.4875 | 1.0 |
| 10.5423 | 87.0 | 11223 | 39.1179 | 1.0 |
| 11.1968 | 88.0 | 11352 | 39.4084 | 1.0 |
| 10.6376 | 89.0 | 11481 | 39.8218 | 1.0 |
| 10.7131 | 90.0 | 11610 | 39.2553 | 1.0 |
| 10.8252 | 91.0 | 11739 | 39.1368 | 1.0 |
| 10.6456 | 92.0 | 11868 | 38.9194 | 1.0 |
| 10.8488 | 93.0 | 11997 | 39.5955 | 1.0 |
| 10.8675 | 94.0 | 12126 | 39.4760 | 1.0 |
| 10.4757 | 95.0 | 12255 | 40.4844 | 1.0 |
| 10.3191 | 96.0 | 12384 | 39.0673 | 1.0 |
| 10.6073 | 97.0 | 12513 | 39.3767 | 1.0 |
| 10.3038 | 98.0 | 12642 | 39.6969 | 1.0 |
| 11.0709 | 99.0 | 12771 | 39.9325 | 1.0 |
| 10.5951 | 100.0 | 12900 | 39.8755 | 1.0 |
### Framework versions
- Transformers 4.50.0.dev0
- PyTorch 2.4.1
- Datasets 3.3.2
- Tokenizers 0.21.0