swin-large-mask2former-finetuned-Mitoliked-Mito-Innertongue2

This model is a fine-tuned version of Dnq2025/swin-large-mask2former-finetuned-ER-Mito-LD8 on the Dnq2025/Mask2former_Finetune dataset. It achieves the following results on the evaluation set:

  • Mean IoU: 0.4460
  • Loss: 40.5938
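
The card does not include usage code. Below is a minimal inference sketch, assuming the checkpoint loads as a `Mask2FormerForUniversalSegmentation` semantic-segmentation model via the standard Transformers API; the file name `example_em_slice.png` is a placeholder for your own image.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

checkpoint = "Dnq2025/swin-large-mask2former-finetuned-Mitoliked-Mito-Innertongue2"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = Mask2FormerForUniversalSegmentation.from_pretrained(checkpoint)

# Placeholder input; replace with your own micrograph.
image = Image.open("example_em_slice.png").convert("RGB")

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Post-process to a per-pixel class map at the original resolution
# (assumes the checkpoint was trained for semantic segmentation).
semantic_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]
)[0]
print(semantic_map.shape)  # (height, width) tensor of class ids
```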

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0004
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 1337
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: polynomial
  • training_steps: 5632
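
The original training script is not included. A sketch of a `TrainingArguments` configuration mirroring the listed hyperparameters might look like the following; `output_dir` and any argument not listed above are assumptions.

```python
from transformers import TrainingArguments

# Sketch only: fields below mirror the hyperparameters listed above;
# output_dir and all omitted arguments are assumptions, not from the card.
training_args = TrainingArguments(
    output_dir="swin-large-mask2former-finetuned-Mitoliked-Mito-Innertongue2",
    learning_rate=4e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=1337,
    optim="adamw_torch",           # AdamW; betas=(0.9, 0.999), eps=1e-8 are torch defaults
    lr_scheduler_type="polynomial",
    max_steps=5632,
)
```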

Training results

| Training Loss | Epoch | Step | Mean IoU | Validation Loss |
|:-------------:|:-----:|:----:|:--------:|:---------------:|
| 43.3313       | 2.0   | 176  | 0.3371   | 35.6391         |
| 31.1848       | 4.0   | 352  | 0.3681   | 34.4077         |
| 26.4868       | 6.0   | 528  | 0.3798   | 35.5335         |
| 23.4201       | 8.0   | 704  | 0.4136   | 34.2507         |
| 21.2972       | 10.0  | 880  | 0.4231   | 34.8946         |
| 19.7902       | 12.0  | 1056 | 0.3975   | 36.7693         |
| 18.2954       | 14.0  | 1232 | 0.4123   | 35.3214         |
| 17.1452       | 16.0  | 1408 | 0.4215   | 35.4186         |
| 16.3828       | 18.0  | 1584 | 0.4043   | 36.7854         |
| 15.5539       | 20.0  | 1760 | 0.4371   | 35.7144         |
| 14.9365       | 22.0  | 1936 | 0.4275   | 36.8812         |
| 14.4327       | 24.0  | 2112 | 0.4051   | 36.8868         |
| 13.6736       | 26.0  | 2288 | 0.4119   | 37.0440         |
| 13.2255       | 28.0  | 2464 | 0.4328   | 37.2634         |
| 12.7617       | 30.0  | 2640 | 0.4252   | 39.2454         |
| 12.3143       | 32.0  | 2816 | 0.4372   | 37.5345         |
| 11.9953       | 34.0  | 2992 | 0.4401   | 37.8913         |
| 11.6438       | 36.0  | 3168 | 0.4501   | 38.7431         |
| 11.3325       | 38.0  | 3344 | 0.4463   | 37.0448         |
| 11.0669       | 40.0  | 3520 | 0.4441   | 38.5237         |
| 10.9012       | 42.0  | 3696 | 0.4313   | 40.2165         |
| 10.6703       | 44.0  | 3872 | 0.4379   | 40.0815         |
| 10.4866       | 46.0  | 4048 | 0.4505   | 40.1928         |
| 10.3071       | 48.0  | 4224 | 0.4421   | 38.1996         |
| 10.144        | 50.0  | 4400 | 0.4473   | 39.9193         |
| 10.0052       | 52.0  | 4576 | 0.4482   | 38.6817         |
| 9.9162        | 54.0  | 4752 | 0.4481   | 39.9527         |
| 9.7836        | 56.0  | 4928 | 0.4498   | 39.1961         |
| 9.6469        | 58.0  | 5104 | 0.4483   | 40.5048         |
| 9.6146        | 60.0  | 5280 | 0.4471   | 41.2389         |
| 9.5224        | 62.0  | 5456 | 0.4455   | 40.9301         |
| 9.4583        | 64.0  | 5632 | 0.4460   | 40.8836         |
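
The card does not say how Mean IoU is computed. A plausible sketch using the `evaluate` library's `mean_iou` metric is shown below; the `num_labels` and `ignore_index` values are placeholders, not taken from the card.

```python
import evaluate
import numpy as np

# Assumption: Mean IoU is the standard semantic-segmentation metric,
# averaged over classes; placeholder arrays stand in for real outputs.
metric = evaluate.load("mean_iou")

predictions = [np.zeros((512, 512), dtype=np.int64)]  # predicted class maps
references = [np.zeros((512, 512), dtype=np.int64)]   # ground-truth class maps

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=2,        # placeholder: set to the dataset's class count
    ignore_index=255,    # placeholder: common "void" label value
)
print(results["mean_iou"])
```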

Framework versions

  • Transformers 4.50.0.dev0
  • Pytorch 2.4.1
  • Datasets 3.3.2
  • Tokenizers 0.21.0