# segformer-b5-finetuned-morphpadver1-hgo-coord-v9_mix_resample_40epochs
This model is a fine-tuned version of nvidia/mit-b5 on the NICOPOI-9/morphpad_coord_hgo_512_4class_v2 dataset. It achieves the following results on the evaluation set:
- Loss: 0.6422
- Mean Iou: 0.7347
- Mean Accuracy: 0.8436
- Overall Accuracy: 0.8478
- Accuracy 0-0: 0.7840
- Accuracy 0-90: 0.8932
- Accuracy 90-0: 0.8779
- Accuracy 90-90: 0.8194
- Iou 0-0: 0.7200
- Iou 0-90: 0.7429
- Iou 90-0: 0.7413
- Iou 90-90: 0.7348
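The headline Mean Iou and Mean Accuracy are consistent with a simple macro-average of the four per-class values, which can be checked directly:

```python
# Sanity check: the reported Mean Iou / Mean Accuracy above are macro-averages
# of the per-class values (classes: 0-0, 0-90, 90-0, 90-90).
per_class_iou = [0.7200, 0.7429, 0.7413, 0.7348]
per_class_acc = [0.7840, 0.8932, 0.8779, 0.8194]

mean_iou = sum(per_class_iou) / len(per_class_iou)  # 0.73475 -> reported 0.7347
mean_acc = sum(per_class_acc) / len(per_class_acc)  # 0.843625 -> reported 0.8436

assert abs(mean_iou - 0.7347) < 5e-4
assert abs(mean_acc - 0.8436) < 5e-4
```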
## Model description
More information needed
## Intended uses & limitations
More information needed
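SegFormer predicts logits at 1/4 of the input resolution, so typical usage upsamples them back to the input size before taking a per-pixel argmax over the 4 classes. The sketch below uses random logits as a stand-in for real model output (a 512x512 input is assumed, matching the dataset name):

```python
# Sketch of typical SegFormer post-processing for this 4-class model.
# Random logits stand in for model(...).logits; SegFormer outputs at
# 1/4 of the input resolution (here 128x128 for a 512x512 input).
import torch
import torch.nn.functional as F

num_classes = 4  # classes: 0-0, 0-90, 90-0, 90-90
logits = torch.randn(1, num_classes, 128, 128)  # stand-in for model output

# Upsample to the input resolution, then take a per-pixel argmax.
upsampled = F.interpolate(logits, size=(512, 512),
                          mode="bilinear", align_corners=False)
pred = upsampled.argmax(dim=1)  # shape (1, 512, 512), values in {0, 1, 2, 3}
```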
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 40
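The hyperparameters above can be expressed as a `transformers.TrainingArguments` configuration. This is a minimal sketch, not the exact training script: `output_dir` and any settings not listed on this card are placeholders or library defaults.

```python
# Minimal sketch of TrainingArguments matching the listed hyperparameters.
# output_dir is a placeholder; unlisted settings fall back to defaults.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="segformer-b5-morphpad",  # placeholder
    learning_rate=6e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=40,
)
```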
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.3684 | 1.3638 | 4000 | 1.3355 | 0.1756 | 0.3310 | 0.3524 | 0.1001 | 0.3549 | 0.7755 | 0.0937 | 0.0866 | 0.2432 | 0.2910 | 0.0815 |
0.6699 | 2.7276 | 8000 | 1.1446 | 0.3199 | 0.4833 | 0.4927 | 0.3842 | 0.5642 | 0.5994 | 0.3853 | 0.2808 | 0.3636 | 0.3545 | 0.2808 |
0.7075 | 4.0914 | 12000 | 1.0016 | 0.4314 | 0.6036 | 0.6040 | 0.6056 | 0.5476 | 0.6773 | 0.5840 | 0.4196 | 0.4287 | 0.4540 | 0.4233 |
0.4978 | 5.4552 | 16000 | 0.8600 | 0.5036 | 0.6641 | 0.6709 | 0.5657 | 0.7738 | 0.6865 | 0.6305 | 0.4895 | 0.5093 | 0.5085 | 0.5071 |
0.6194 | 6.8190 | 20000 | 0.8307 | 0.5419 | 0.7019 | 0.7030 | 0.6992 | 0.7035 | 0.7266 | 0.6782 | 0.5500 | 0.5426 | 0.5441 | 0.5311 |
0.2622 | 8.1827 | 24000 | 0.7177 | 0.5987 | 0.7469 | 0.7491 | 0.7321 | 0.7660 | 0.7749 | 0.7148 | 0.5962 | 0.5985 | 0.6019 | 0.5984 |
0.9683 | 9.5465 | 28000 | 0.7541 | 0.5951 | 0.7409 | 0.7474 | 0.6970 | 0.7901 | 0.8298 | 0.6467 | 0.5996 | 0.6063 | 0.6010 | 0.5736 |
0.2542 | 10.9103 | 32000 | 0.7039 | 0.6381 | 0.7768 | 0.7791 | 0.7554 | 0.7818 | 0.8233 | 0.7465 | 0.6333 | 0.6498 | 0.6295 | 0.6400 |
0.1691 | 12.2741 | 36000 | 0.6232 | 0.6632 | 0.7959 | 0.7978 | 0.7673 | 0.8280 | 0.8005 | 0.7877 | 0.6636 | 0.6778 | 0.6536 | 0.6580 |
0.1883 | 13.6379 | 40000 | 0.6711 | 0.6649 | 0.7948 | 0.7996 | 0.7315 | 0.8565 | 0.8291 | 0.7622 | 0.6514 | 0.6774 | 0.6675 | 0.6632 |
0.164 | 15.0017 | 44000 | 0.6627 | 0.6688 | 0.7980 | 0.8022 | 0.7670 | 0.8227 | 0.8637 | 0.7386 | 0.6740 | 0.6846 | 0.6687 | 0.6479 |
0.2406 | 16.3655 | 48000 | 0.6364 | 0.6930 | 0.8159 | 0.8194 | 0.7843 | 0.8565 | 0.8466 | 0.7763 | 0.6894 | 0.7005 | 0.7017 | 0.6805 |
0.109 | 17.7293 | 52000 | 0.6087 | 0.6872 | 0.8119 | 0.8153 | 0.7622 | 0.8473 | 0.8443 | 0.7940 | 0.6733 | 0.7055 | 0.6834 | 0.6868 |
0.1262 | 19.0931 | 56000 | 0.6101 | 0.6999 | 0.8202 | 0.8240 | 0.7795 | 0.8572 | 0.8619 | 0.7823 | 0.6912 | 0.7071 | 0.7041 | 0.6972 |
0.1633 | 20.4569 | 60000 | 0.6434 | 0.7056 | 0.8239 | 0.8280 | 0.7832 | 0.8548 | 0.8795 | 0.7781 | 0.7006 | 0.7194 | 0.7056 | 0.6969 |
8.0069 | 21.8207 | 64000 | 0.5640 | 0.7111 | 0.8286 | 0.8319 | 0.8192 | 0.8718 | 0.8567 | 0.7666 | 0.7149 | 0.7267 | 0.7119 | 0.6909 |
0.0335 | 23.1845 | 68000 | 0.5820 | 0.7215 | 0.8348 | 0.8388 | 0.7894 | 0.8701 | 0.8828 | 0.7967 | 0.7085 | 0.7253 | 0.7293 | 0.7230 |
0.3274 | 24.5482 | 72000 | 0.6041 | 0.7210 | 0.8346 | 0.8386 | 0.7929 | 0.8767 | 0.8754 | 0.7933 | 0.7157 | 0.7344 | 0.7223 | 0.7117 |
0.137 | 25.9120 | 76000 | 0.6174 | 0.7000 | 0.8197 | 0.8246 | 0.7584 | 0.8768 | 0.8609 | 0.7829 | 0.6897 | 0.7138 | 0.7079 | 0.6886 |
0.0973 | 27.2758 | 80000 | 0.6329 | 0.7039 | 0.8208 | 0.8276 | 0.7483 | 0.9023 | 0.8783 | 0.7542 | 0.6896 | 0.7184 | 0.7139 | 0.6936 |
0.0938 | 28.6396 | 84000 | 0.5952 | 0.7273 | 0.8412 | 0.8421 | 0.8351 | 0.8486 | 0.8535 | 0.8276 | 0.7316 | 0.7346 | 0.7211 | 0.7218 |
0.0558 | 30.0034 | 88000 | 0.6204 | 0.7017 | 0.8193 | 0.8260 | 0.7578 | 0.8958 | 0.8822 | 0.7412 | 0.6987 | 0.7170 | 0.7090 | 0.6822 |
0.0048 | 31.3672 | 92000 | 0.6403 | 0.7057 | 0.8219 | 0.8289 | 0.7465 | 0.8980 | 0.8915 | 0.7514 | 0.6933 | 0.7244 | 0.7130 | 0.6920 |
0.0134 | 32.7310 | 96000 | 0.6758 | 0.7192 | 0.8333 | 0.8375 | 0.8073 | 0.8800 | 0.8754 | 0.7704 | 0.7150 | 0.7309 | 0.7256 | 0.7052 |
0.0326 | 34.0948 | 100000 | 0.6023 | 0.7256 | 0.8362 | 0.8419 | 0.7617 | 0.9030 | 0.8856 | 0.7944 | 0.7094 | 0.7337 | 0.7341 | 0.7254 |
0.0379 | 35.4586 | 104000 | 0.6208 | 0.7342 | 0.8436 | 0.8472 | 0.7932 | 0.8949 | 0.8641 | 0.8220 | 0.7245 | 0.7368 | 0.7419 | 0.7334 |
0.0676 | 36.8224 | 108000 | 0.6448 | 0.7298 | 0.8401 | 0.8446 | 0.7877 | 0.8896 | 0.8831 | 0.8001 | 0.7180 | 0.7440 | 0.7336 | 0.7236 |
0.0604 | 38.1862 | 112000 | 0.6608 | 0.7334 | 0.8426 | 0.8468 | 0.7867 | 0.8845 | 0.8869 | 0.8124 | 0.7214 | 0.7441 | 0.7364 | 0.7315 |
0.018 | 39.5499 | 116000 | 0.6422 | 0.7347 | 0.8436 | 0.8478 | 0.7840 | 0.8932 | 0.8779 | 0.8194 | 0.7200 | 0.7429 | 0.7413 | 0.7348 |
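The step/epoch columns imply roughly 2,933 optimizer steps per epoch (4,000 steps at epoch 1.3638), so 40 epochs is about 117,320 steps. Assuming no warmup (warmup steps are not listed on this card), the linear scheduler's per-step learning rate can be sketched as:

```python
# Approximate per-step learning rate under linear decay with no warmup
# (an assumption; the card does not list warmup steps).
base_lr = 6e-5
steps_per_epoch = 2933            # inferred: 4000 steps = 1.3638 epochs
total_steps = 40 * steps_per_epoch  # about 117,320 steps

def lr_at(step):
    """Linearly decay from base_lr at step 0 to 0 at total_steps."""
    return base_lr * max(0.0, 1 - step / total_steps)

for step in (4000, 60000, 116000):
    print(f"step {step}: lr = {lr_at(step):.3e}")
```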
## Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0