---
library_name: transformers
license: other
base_model: facebook/mask2former-swin-base-IN21k-ade-semantic
tags:
- image-segmentation
- vision
- generated_from_trainer
model-index:
- name: mask2former-finetuned-ER-Mito-LD3
  results: []
---

# mask2former-finetuned-ER-Mito-LD3

This model is a fine-tuned version of [facebook/mask2former-swin-base-IN21k-ade-semantic](https://huggingface.co/facebook/mask2former-swin-base-IN21k-ade-semantic) on the Dnq2025/Mask2former_Pretrain dataset. It achieves the following results on the evaluation set (an inference sketch follows the list):

- Loss: 33.5405
- Dummy: 1.0
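The snippet below is a minimal inference sketch, assuming the checkpoint is published under the repo id `Dnq2025/mask2former-finetuned-ER-Mito-LD3` (inferred from the model name above; adjust if the actual id differs) and that the base model's semantic-segmentation task applies:

```python
# Minimal semantic-segmentation inference sketch (repo id is assumed).
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

repo_id = "Dnq2025/mask2former-finetuned-ER-Mito-LD3"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(repo_id)
model = Mask2FormerForUniversalSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("example.png").convert("RGB")  # replace with your image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Collapse the predicted masks into a (height, width) map of class ids.
semantic_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]  # PIL size is (W, H)
)[0]
print(semantic_map.shape)
```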

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent `TrainingArguments` follows the list):

- learning_rate: 0.0001
- train_batch_size: 5
- eval_batch_size: 5
- seed: 1337
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 12900
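These values map directly onto `transformers.TrainingArguments`. Below is a hedged reconstruction; the actual training script is not included in this card, so `output_dir` and the per-epoch evaluation setting (inferred from the results table below) are assumptions:

```python
# Sketch of TrainingArguments matching the hyperparameters above (not the
# author's actual script; output_dir and eval_strategy are assumptions).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mask2former-finetuned-ER-Mito-LD3",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=5,   # card reports train_batch_size: 5
    per_device_eval_batch_size=5,
    seed=1337,
    optim="adamw_torch",             # AdamW, betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="polynomial",
    max_steps=12900,                 # training_steps above
    eval_strategy="epoch",           # inferred: the table logs one eval per epoch
)
```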

### Training results

| Training Loss | Epoch    | Step  | Validation Loss | Dummy |
|:-------------:|:--------:|:-----:|:---------------:|:-----:|
| 48.5372       | 1.0      | 104   | 36.3338         | 1.0   |
| 33.5327       | 2.0      | 208   | 31.7351         | 1.0   |
| 29.7691       | 3.0      | 312   | 30.4858         | 1.0   |
| 26.3002       | 4.0      | 416   | 28.6091         | 1.0   |
| 24.7501       | 5.0      | 520   | 27.0967         | 1.0   |
| 23.4495       | 6.0      | 624   | 26.6241         | 1.0   |
| 23.274        | 7.0      | 728   | 27.1544         | 1.0   |
| 21.1617       | 8.0      | 832   | 27.4625         | 1.0   |
| 20.373        | 9.0      | 936   | 27.5745         | 1.0   |
| 20.4295       | 10.0     | 1040  | 27.6942         | 1.0   |
| 20.2526       | 11.0     | 1144  | 27.7829         | 1.0   |
| 19.2572       | 12.0     | 1248  | 27.2960         | 1.0   |
| 19.0089       | 13.0     | 1352  | 26.0039         | 1.0   |
| 18.3621       | 14.0     | 1456  | 26.5623         | 1.0   |
| 18.0517       | 15.0     | 1560  | 26.2700         | 1.0   |
| 18.3139       | 16.0     | 1664  | 27.2972         | 1.0   |
| 17.6129       | 17.0     | 1768  | 26.4869         | 1.0   |
| 17.8402       | 18.0     | 1872  | 27.7618         | 1.0   |
| 16.6494       | 19.0     | 1976  | 27.5173         | 1.0   |
| 17.0833       | 20.0     | 2080  | 28.1242         | 1.0   |
| 16.5967       | 21.0     | 2184  | 29.1195         | 1.0   |
| 16.2634       | 22.0     | 2288  | 27.0367         | 1.0   |
| 16.6797       | 23.0     | 2392  | 27.1799         | 1.0   |
| 16.0344       | 24.0     | 2496  | 26.6408         | 1.0   |
| 15.7701       | 25.0     | 2600  | 28.4040         | 1.0   |
| 15.6061       | 26.0     | 2704  | 28.0687         | 1.0   |
| 15.3311       | 27.0     | 2808  | 27.1765         | 1.0   |
| 15.2464       | 28.0     | 2912  | 28.2050         | 1.0   |
| 15.0459       | 29.0     | 3016  | 28.6291         | 1.0   |
| 14.7514       | 30.0     | 3120  | 27.8241         | 1.0   |
| 15.0833       | 31.0     | 3224  | 29.1936         | 1.0   |
| 15.0817       | 32.0     | 3328  | 28.4044         | 1.0   |
| 14.3201       | 33.0     | 3432  | 28.3709         | 1.0   |
| 14.5918       | 34.0     | 3536  | 29.3898         | 1.0   |
| 14.7177       | 35.0     | 3640  | 28.5130         | 1.0   |
| 13.9919       | 36.0     | 3744  | 27.7597         | 1.0   |
| 14.2267       | 37.0     | 3848  | 29.2324         | 1.0   |
| 13.7801       | 38.0     | 3952  | 28.3574         | 1.0   |
| 14.1839       | 39.0     | 4056  | 28.8711         | 1.0   |
| 13.7545       | 40.0     | 4160  | 28.2947         | 1.0   |
| 14.1627       | 41.0     | 4264  | 29.4866         | 1.0   |
| 13.5155       | 42.0     | 4368  | 29.8527         | 1.0   |
| 13.704        | 43.0     | 4472  | 29.4292         | 1.0   |
| 13.6644       | 44.0     | 4576  | 29.2324         | 1.0   |
| 13.2006       | 45.0     | 4680  | 29.5414         | 1.0   |
| 13.1545       | 46.0     | 4784  | 29.6988         | 1.0   |
| 13.5744       | 47.0     | 4888  | 28.9933         | 1.0   |
| 12.8073       | 48.0     | 4992  | 28.9770         | 1.0   |
| 13.3773       | 49.0     | 5096  | 30.3950         | 1.0   |
| 12.9506       | 50.0     | 5200  | 31.2871         | 1.0   |
| 13.0674       | 51.0     | 5304  | 29.5711         | 1.0   |
| 13.1265       | 52.0     | 5408  | 31.0887         | 1.0   |
| 13.1392       | 53.0     | 5512  | 29.8433         | 1.0   |
| 12.6108       | 54.0     | 5616  | 29.6436         | 1.0   |
| 12.7608       | 55.0     | 5720  | 29.8706         | 1.0   |
| 12.8723       | 56.0     | 5824  | 30.0596         | 1.0   |
| 12.5437       | 57.0     | 5928  | 30.1367         | 1.0   |
| 12.1387       | 58.0     | 6032  | 30.4089         | 1.0   |
| 12.948        | 59.0     | 6136  | 30.5375         | 1.0   |
| 12.2869       | 60.0     | 6240  | 32.3827         | 1.0   |
| 12.7717       | 61.0     | 6344  | 30.6397         | 1.0   |
| 12.4924       | 62.0     | 6448  | 30.7005         | 1.0   |
| 12.3031       | 63.0     | 6552  | 29.9865         | 1.0   |
| 12.5575       | 64.0     | 6656  | 31.0697         | 1.0   |
| 11.9496       | 65.0     | 6760  | 31.5794         | 1.0   |
| 12.0462       | 66.0     | 6864  | 31.6537         | 1.0   |
| 12.7167       | 67.0     | 6968  | 30.7411         | 1.0   |
| 11.8595       | 68.0     | 7072  | 30.4970         | 1.0   |
| 11.7458       | 69.0     | 7176  | 30.8332         | 1.0   |
| 12.2058       | 70.0     | 7280  | 32.0951         | 1.0   |
| 12.0874       | 71.0     | 7384  | 32.4695         | 1.0   |
| 11.705        | 72.0     | 7488  | 31.3117         | 1.0   |
| 12.0          | 73.0     | 7592  | 30.6540         | 1.0   |
| 11.9852       | 74.0     | 7696  | 34.2950         | 1.0   |
| 11.7597       | 75.0     | 7800  | 31.6361         | 1.0   |
| 11.8713       | 76.0     | 7904  | 31.1082         | 1.0   |
| 11.705        | 77.0     | 8008  | 31.8714         | 1.0   |
| 11.5474       | 78.0     | 8112  | 31.0299         | 1.0   |
| 11.8387       | 79.0     | 8216  | 31.3672         | 1.0   |
| 11.7057       | 80.0     | 8320  | 31.6435         | 1.0   |
| 11.5656       | 81.0     | 8424  | 31.1940         | 1.0   |
| 11.6578       | 82.0     | 8528  | 31.8184         | 1.0   |
| 11.3049       | 83.0     | 8632  | 31.8668         | 1.0   |
| 11.5542       | 84.0     | 8736  | 32.8192         | 1.0   |
| 11.3942       | 85.0     | 8840  | 30.9723         | 1.0   |
| 11.6955       | 86.0     | 8944  | 31.3487         | 1.0   |
| 11.4862       | 87.0     | 9048  | 32.0451         | 1.0   |
| 11.5867       | 88.0     | 9152  | 31.9769         | 1.0   |
| 11.0975       | 89.0     | 9256  | 31.9721         | 1.0   |
| 11.5126       | 90.0     | 9360  | 35.3877         | 1.0   |
| 11.067        | 91.0     | 9464  | 33.7614         | 1.0   |
| 11.3857       | 92.0     | 9568  | 32.7046         | 1.0   |
| 11.5511       | 93.0     | 9672  | 32.1096         | 1.0   |
| 11.0961       | 94.0     | 9776  | 32.8302         | 1.0   |
| 11.2935       | 95.0     | 9880  | 32.6688         | 1.0   |
| 11.2398       | 96.0     | 9984  | 32.2807         | 1.0   |
| 11.0444       | 97.0     | 10088 | 32.2766         | 1.0   |
| 11.3157       | 98.0     | 10192 | 32.4437         | 1.0   |
| 11.0191       | 99.0     | 10296 | 32.3851         | 1.0   |
| 11.1406       | 100.0    | 10400 | 32.1389         | 1.0   |
| 11.1237       | 101.0    | 10504 | 32.4886         | 1.0   |
| 10.9485       | 102.0    | 10608 | 32.5051         | 1.0   |
| 10.9188       | 103.0    | 10712 | 32.8615         | 1.0   |
| 11.3029       | 104.0    | 10816 | 33.0388         | 1.0   |
| 11.2023       | 105.0    | 10920 | 32.4923         | 1.0   |
| 10.9634       | 106.0    | 11024 | 32.3288         | 1.0   |
| 11.257        | 107.0    | 11128 | 31.8855         | 1.0   |
| 11.0193       | 108.0    | 11232 | 34.0067         | 1.0   |
| 10.6401       | 109.0    | 11336 | 33.2946         | 1.0   |
| 11.0542       | 110.0    | 11440 | 34.0535         | 1.0   |
| 10.888        | 111.0    | 11544 | 32.7206         | 1.0   |
| 10.9706       | 112.0    | 11648 | 33.1238         | 1.0   |
| 11.0075       | 113.0    | 11752 | 32.9882         | 1.0   |
| 10.7895       | 114.0    | 11856 | 32.7985         | 1.0   |
| 10.9181       | 115.0    | 11960 | 32.9143         | 1.0   |
| 10.5938       | 116.0    | 12064 | 33.0722         | 1.0   |
| 10.4932       | 117.0    | 12168 | 34.2366         | 1.0   |
| 10.9761       | 118.0    | 12272 | 33.8880         | 1.0   |
| 10.6918       | 119.0    | 12376 | 34.3289         | 1.0   |
| 10.896        | 120.0    | 12480 | 33.6095         | 1.0   |
| 10.6876       | 121.0    | 12584 | 33.8608         | 1.0   |
| 10.5666       | 122.0    | 12688 | 33.6994         | 1.0   |
| 10.8161       | 123.0    | 12792 | 33.6172         | 1.0   |
| 10.7195       | 124.0    | 12896 | 33.5397         | 1.0   |
| 10.6712       | 124.0385 | 12900 | 33.4906         | 1.0   |
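Note that the best validation loss in the table, 26.0039, occurs at epoch 13 (step 1352); from there validation loss drifts upward while training loss keeps falling, a pattern consistent with overfitting, so an earlier checkpoint may generalize better than the final one.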

### Framework versions

- Transformers 4.50.0.dev0
- Pytorch 2.4.1
- Datasets 3.3.2
- Tokenizers 0.21.0