# segformer-b5-finetuned-ade20k-morphpadver1-hgo-coord_40epochs_distortion_ver2_global_norm

This model is a fine-tuned version of [nvidia/segformer-b5-finetuned-ade-640-640](https://huggingface.co/nvidia/segformer-b5-finetuned-ade-640-640) on the NICOPOI-9/Morphpad_HGO_1600_coord_global_norm dataset. It achieves the following results on the evaluation set:
- Loss: 0.1104
- Mean Iou: 0.9755
- Mean Accuracy: 0.9877
- Overall Accuracy: 0.9875
- Accuracy 0-0: 0.9892
- Accuracy 0-90: 0.9855
- Accuracy 90-0: 0.9863
- Accuracy 90-90: 0.9897
- Iou 0-0: 0.9778
- Iou 0-90: 0.9741
- Iou 90-0: 0.9734
- Iou 90-90: 0.9766
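
The card does not include usage code, so the snippet below is a minimal inference sketch using the standard `transformers` Segformer classes. The repo id is an assumption inferred from the model name and the NICOPOI-9 dataset organization; it is not stated explicitly in the card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Assumed hub path; the card names the model but not its repo id.
checkpoint = "NICOPOI-9/segformer-b5-finetuned-ade20k-morphpadver1-hgo-coord_40epochs_distortion_ver2_global_norm"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```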
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 40
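
For reference, the list above maps onto the `transformers` Trainer configuration roughly as follows. This is a hedged reconstruction, not the exact script used for this run; `output_dir` and any unlisted arguments are placeholders.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b5-morphpad",  # placeholder, not from the card
    learning_rate=6e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=40,
)
```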
### Training results

Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
1.3668 | 1.3680 | 4000 | 1.2660 | 0.2258 | 0.3802 | 0.3984 | 0.1901 | 0.4693 | 0.6808 | 0.1805 | 0.1537 | 0.2979 | 0.3155 | 0.1360 |
0.8028 | 2.7360 | 8000 | 1.0729 | 0.3341 | 0.5062 | 0.5266 | 0.2943 | 0.7462 | 0.6891 | 0.2953 | 0.2476 | 0.4225 | 0.4118 | 0.2544 |
0.9605 | 4.1040 | 12000 | 0.7966 | 0.5070 | 0.6678 | 0.6795 | 0.5623 | 0.7823 | 0.8018 | 0.5249 | 0.4831 | 0.5564 | 0.5338 | 0.4548 |
0.59 | 5.4720 | 16000 | 0.5049 | 0.6944 | 0.8173 | 0.8197 | 0.7840 | 0.8640 | 0.8169 | 0.8043 | 0.6977 | 0.6983 | 0.6912 | 0.6906 |
0.4359 | 6.8399 | 20000 | 0.3698 | 0.7779 | 0.8755 | 0.8748 | 0.8698 | 0.8828 | 0.8515 | 0.8977 | 0.7808 | 0.7776 | 0.7678 | 0.7855 |
0.3519 | 8.2079 | 24000 | 0.3545 | 0.7831 | 0.8734 | 0.8785 | 0.8256 | 0.9371 | 0.9178 | 0.8129 | 0.7893 | 0.7908 | 0.7801 | 0.7721 |
0.2117 | 9.5759 | 28000 | 0.2449 | 0.8532 | 0.9180 | 0.9205 | 0.8938 | 0.9490 | 0.9396 | 0.8894 | 0.8625 | 0.8530 | 0.8454 | 0.8520 |
0.4507 | 10.9439 | 32000 | 0.2180 | 0.8723 | 0.9293 | 0.9315 | 0.9214 | 0.9632 | 0.9430 | 0.8894 | 0.8865 | 0.8644 | 0.8736 | 0.8645 |
0.1921 | 12.3119 | 36000 | 0.1766 | 0.9018 | 0.9475 | 0.9480 | 0.9375 | 0.9644 | 0.9383 | 0.9499 | 0.9086 | 0.8969 | 0.8936 | 0.9081 |
0.6741 | 13.6799 | 40000 | 0.1713 | 0.9009 | 0.9467 | 0.9478 | 0.9612 | 0.9654 | 0.9563 | 0.9037 | 0.9183 | 0.8975 | 0.9027 | 0.8852 |
0.0979 | 15.0479 | 44000 | 0.1528 | 0.9257 | 0.9610 | 0.9612 | 0.9614 | 0.9568 | 0.9697 | 0.9560 | 0.9335 | 0.9255 | 0.9165 | 0.9273 |
0.2138 | 16.4159 | 48000 | 0.1637 | 0.9177 | 0.9561 | 0.9568 | 0.9494 | 0.9608 | 0.9651 | 0.9492 | 0.9240 | 0.9101 | 0.9153 | 0.9214 |
0.1426 | 17.7839 | 52000 | 0.1263 | 0.9454 | 0.9716 | 0.9718 | 0.9735 | 0.9734 | 0.9737 | 0.9659 | 0.9502 | 0.9438 | 0.9416 | 0.9460 |
0.1079 | 19.1518 | 56000 | 0.1401 | 0.9299 | 0.9625 | 0.9635 | 0.9630 | 0.9705 | 0.9793 | 0.9370 | 0.9403 | 0.9339 | 0.9225 | 0.9227 |
0.0968 | 20.5198 | 60000 | 0.1735 | 0.9231 | 0.9592 | 0.9600 | 0.9447 | 0.9648 | 0.9674 | 0.9600 | 0.9202 | 0.9239 | 0.9185 | 0.9300 |
0.1719 | 21.8878 | 64000 | 0.1326 | 0.9459 | 0.9718 | 0.9720 | 0.9579 | 0.9696 | 0.9751 | 0.9848 | 0.9464 | 0.9406 | 0.9434 | 0.9530 |
0.0587 | 23.2558 | 68000 | 0.1135 | 0.9585 | 0.9791 | 0.9786 | 0.9850 | 0.9710 | 0.9773 | 0.9831 | 0.9635 | 0.9542 | 0.9558 | 0.9603 |
0.2671 | 24.6238 | 72000 | 0.1184 | 0.9548 | 0.9761 | 0.9768 | 0.9655 | 0.9823 | 0.9827 | 0.9742 | 0.9544 | 0.9576 | 0.9485 | 0.9587 |
0.0418 | 25.9918 | 76000 | 0.1169 | 0.9605 | 0.9797 | 0.9797 | 0.9818 | 0.9765 | 0.9836 | 0.9768 | 0.9668 | 0.9574 | 0.9570 | 0.9608 |
0.0587 | 27.3598 | 80000 | 0.1084 | 0.9590 | 0.9786 | 0.9790 | 0.9714 | 0.9810 | 0.9843 | 0.9776 | 0.9600 | 0.9586 | 0.9553 | 0.9623 |
0.0067 | 28.7278 | 84000 | 0.1190 | 0.9641 | 0.9816 | 0.9815 | 0.9826 | 0.9775 | 0.9848 | 0.9815 | 0.9703 | 0.9629 | 0.9590 | 0.9641 |
0.0369 | 30.0958 | 88000 | 0.1230 | 0.9644 | 0.9818 | 0.9818 | 0.9804 | 0.9822 | 0.9811 | 0.9834 | 0.9667 | 0.9613 | 0.9627 | 0.9670 |
0.0814 | 31.4637 | 92000 | 0.1184 | 0.9664 | 0.9828 | 0.9828 | 0.9838 | 0.9827 | 0.9853 | 0.9792 | 0.9688 | 0.9674 | 0.9633 | 0.9661 |
0.1671 | 32.8317 | 96000 | 0.1193 | 0.9676 | 0.9835 | 0.9834 | 0.9839 | 0.9820 | 0.9837 | 0.9845 | 0.9718 | 0.9652 | 0.9661 | 0.9674 |
0.1289 | 34.1997 | 100000 | 0.1201 | 0.9620 | 0.9801 | 0.9806 | 0.9705 | 0.9845 | 0.9842 | 0.9811 | 0.9609 | 0.9588 | 0.9624 | 0.9657 |
0.0551 | 35.5677 | 104000 | 0.1159 | 0.9707 | 0.9851 | 0.9850 | 0.9863 | 0.9831 | 0.9854 | 0.9856 | 0.9741 | 0.9680 | 0.9689 | 0.9717 |
0.0217 | 36.9357 | 108000 | 0.1141 | 0.9732 | 0.9865 | 0.9863 | 0.9886 | 0.9826 | 0.9868 | 0.9879 | 0.9767 | 0.9705 | 0.9722 | 0.9733 |
0.0031 | 38.3037 | 112000 | 0.1177 | 0.9735 | 0.9866 | 0.9865 | 0.9891 | 0.9843 | 0.9879 | 0.9849 | 0.9770 | 0.9724 | 0.9714 | 0.9733 |
0.0435 | 39.6717 | 116000 | 0.1104 | 0.9755 | 0.9877 | 0.9875 | 0.9892 | 0.9855 | 0.9863 | 0.9897 | 0.9778 | 0.9741 | 0.9734 | 0.9766 |
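
The aggregate metrics in the final row are consistent with an unweighted mean over the four per-class scores, which can be checked directly:

```python
# Final-row per-class IoUs from the table above (0-0, 0-90, 90-0, 90-90).
ious = [0.9778, 0.9741, 0.9734, 0.9766]
print(round(sum(ious) / len(ious), 4))  # 0.9755, matching the reported Mean Iou
```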
### Framework versions

- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0