segformer-b2-finetuned-ade-512-512_necrosis

This model is a fine-tuned version of nvidia/segformer-b2-finetuned-ade-512-512 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0547
  • Mean Iou: 0.8851
  • Mean Accuracy: 0.9274
  • Overall Accuracy: 0.9826
  • Accuracy Background: 0.9941
  • Accuracy Necrosis: 0.8203
  • Accuracy Root: 0.9678
  • Iou Background: 0.9889
  • Iou Necrosis: 0.7417
  • Iou Root: 0.9247
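The per-class IoU figures above are pixel-level intersection-over-union scores, and Mean Iou is their average over the three classes (background, necrosis, root). As a minimal sketch (the class indices here are an assumption, not confirmed by the card), the metric can be computed like this:

```python
import numpy as np

# Hypothetical label mapping matching the card: 0=background, 1=necrosis, 2=root.
def per_class_iou(pred, target, num_classes=3):
    """Pixel-level intersection-over-union for each class."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        ious.append(inter / union if union else float("nan"))
    return ious

# Toy 2x3 prediction/ground-truth masks for illustration.
pred   = np.array([[0, 0, 1], [2, 2, 1]])
target = np.array([[0, 0, 1], [2, 1, 1]])
ious = per_class_iou(pred, target)      # [1.0, 2/3, 0.5]
mean_iou = float(np.nanmean(ious))
```

Classes absent from both masks yield NaN and are excluded from the mean, which mirrors how the `evaluate` library's `mean_iou` metric handles empty classes.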

Model description

More information needed

Intended uses & limitations

More information needed
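Until the card is filled in, here is a minimal inference sketch. It assumes the repo id mujerry/segformer-b2-finetuned-ade-512-512_necrosis, that a preprocessor config was pushed alongside the weights, and the standard Transformers SegFormer API; it is not an official usage snippet.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Assumed repo id; adjust if the model lives elsewhere.
repo = "mujerry/segformer-b2-finetuned-ade-512-512_necrosis"
processor = AutoImageProcessor.from_pretrained(repo)
model = SegformerForSemanticSegmentation.from_pretrained(repo)
model.eval()

image = Image.new("RGB", (512, 512))  # placeholder; use a real image here
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred = upsampled.argmax(dim=1)[0]  # (H, W) map of class indices
```

The resulting `pred` tensor assigns each pixel to one of the three classes; check the repo's `config.json` (`id2label`) for the actual index-to-name mapping.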

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.05
  • num_epochs: 40
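The schedule implied by these hyperparameters is a linear warmup over the first 5% of steps followed by linear decay to zero. As a sketch (the 2560 total optimizer steps are taken from the training log below; the exact step count depends on dataset size):

```python
# Linear warmup + linear decay, as used by the Transformers "linear" scheduler.
BASE_LR = 6e-05
TOTAL_STEPS = 2560                       # 40 epochs x 64 steps/epoch (from the log)
WARMUP_STEPS = int(0.05 * TOTAL_STEPS)   # warmup_ratio 0.05 -> 128 steps

def lr_at(step):
    """Learning rate at a given optimizer step."""
    if step < WARMUP_STEPS:
        return BASE_LR * step / WARMUP_STEPS
    return BASE_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)
```

So the learning rate ramps from 0 to 6e-05 by step 128 and reaches 0 at step 2560.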

Training results

Training Loss Epoch Step Validation Loss Mean Iou Mean Accuracy Overall Accuracy Accuracy Background Accuracy Necrosis Accuracy Root Iou Background Iou Necrosis Iou Root
1.0136 0.3125 20 0.9745 0.2835 0.5534 0.5117 0.5703 0.8384 0.2516 0.5531 0.0588 0.2387
0.782 0.625 40 0.6546 0.6443 0.7573 0.9244 0.9470 0.3958 0.9292 0.9426 0.1808 0.8096
0.5646 0.9375 60 0.5035 0.6000 0.6673 0.9352 0.9622 0.0591 0.9807 0.9595 0.0417 0.7987
0.4075 1.25 80 0.3676 0.6185 0.6781 0.9491 0.9802 0.0744 0.9797 0.9743 0.0697 0.8114
0.3336 1.5625 100 0.2976 0.6525 0.7111 0.9526 0.9793 0.1703 0.9838 0.9751 0.1626 0.8198
0.3046 1.875 120 0.2017 0.8358 0.9058 0.9716 0.9905 0.7937 0.9334 0.9798 0.6453 0.8823
0.1448 2.1875 140 0.1557 0.8383 0.9006 0.9725 0.9850 0.7537 0.9631 0.9798 0.6465 0.8885
0.1214 2.5 160 0.1194 0.8600 0.9089 0.9773 0.9944 0.7847 0.9475 0.9840 0.6915 0.9044
0.1044 2.8125 180 0.1037 0.8590 0.9012 0.9779 0.9938 0.7523 0.9575 0.9848 0.6852 0.9069
0.0875 3.125 200 0.1002 0.8520 0.8956 0.9769 0.9906 0.7280 0.9681 0.9844 0.6686 0.9031
0.0873 3.4375 220 0.0873 0.8574 0.8968 0.9781 0.9919 0.7293 0.9693 0.9853 0.6787 0.9083
0.0823 3.75 240 0.0876 0.8712 0.9292 0.9789 0.9944 0.8486 0.9447 0.9857 0.7185 0.9094
0.0828 4.0625 260 0.0866 0.8657 0.9290 0.9765 0.9934 0.8578 0.9357 0.9827 0.7143 0.9002
0.0601 4.375 280 0.0774 0.8619 0.9002 0.9787 0.9937 0.7430 0.9638 0.9857 0.6901 0.9100
0.0734 4.6875 300 0.0746 0.8588 0.8964 0.9787 0.9924 0.7261 0.9708 0.9860 0.6798 0.9106
0.1485 5.0 320 0.0693 0.8774 0.9267 0.9804 0.9938 0.8291 0.9571 0.9866 0.7293 0.9164
0.0592 5.3125 340 0.0681 0.8739 0.9184 0.9800 0.9927 0.7982 0.9644 0.9862 0.7202 0.9153
0.0599 5.625 360 0.0665 0.8753 0.9207 0.9804 0.9925 0.8039 0.9657 0.9866 0.7224 0.9169
0.0653 5.9375 380 0.0651 0.8774 0.9304 0.9802 0.9946 0.8461 0.9506 0.9863 0.7301 0.9159
0.0729 6.25 400 0.0635 0.8795 0.9241 0.9812 0.9929 0.8125 0.9670 0.9876 0.7311 0.9197
0.0713 6.5625 420 0.0653 0.8785 0.9273 0.9802 0.9954 0.8376 0.9490 0.9862 0.7346 0.9147
0.0584 6.875 440 0.0619 0.8772 0.9173 0.9807 0.9943 0.7956 0.9619 0.9866 0.7273 0.9177
0.0515 7.1875 460 0.0629 0.8644 0.9005 0.9799 0.9933 0.7369 0.9714 0.9871 0.6912 0.9148
0.0423 7.5 480 0.0594 0.8809 0.9237 0.9815 0.9938 0.8119 0.9653 0.9877 0.7337 0.9212
0.0568 7.8125 500 0.0588 0.8822 0.9369 0.9813 0.9925 0.8564 0.9617 0.9877 0.7387 0.9201
0.0786 8.125 520 0.0587 0.8781 0.9178 0.9814 0.9946 0.7945 0.9644 0.9877 0.7260 0.9205
0.0475 8.4375 540 0.0643 0.8693 0.9098 0.9796 0.9923 0.7688 0.9683 0.9860 0.7081 0.9137
0.0556 8.75 560 0.0571 0.8738 0.9099 0.9812 0.9948 0.7673 0.9677 0.9880 0.7134 0.9199
0.0511 9.0625 580 0.0574 0.8786 0.9199 0.9814 0.9923 0.7945 0.9729 0.9878 0.7273 0.9207
0.0392 9.375 600 0.0571 0.8713 0.9074 0.9807 0.9936 0.7576 0.9711 0.9876 0.7088 0.9176
0.0438 9.6875 620 0.0565 0.8823 0.9326 0.9817 0.9949 0.8461 0.9568 0.9882 0.7374 0.9213
0.157 10.0 640 0.0564 0.8829 0.9292 0.9815 0.9944 0.8337 0.9594 0.9877 0.7411 0.9200
0.0404 10.3125 660 0.0571 0.8814 0.9276 0.9811 0.9957 0.8346 0.9526 0.9870 0.7384 0.9188
0.0447 10.625 680 0.0536 0.8814 0.9250 0.9822 0.9933 0.8113 0.9703 0.9888 0.7316 0.9237
0.0353 10.9375 700 0.0571 0.8774 0.9162 0.9812 0.9934 0.7857 0.9695 0.9875 0.7250 0.9198
0.0488 11.25 720 0.0574 0.8821 0.9344 0.9811 0.9950 0.8563 0.9520 0.9875 0.7401 0.9186
0.0444 11.5625 740 0.0595 0.8784 0.9224 0.9792 0.9957 0.8262 0.9454 0.9843 0.7406 0.9104
0.0452 11.875 760 0.0553 0.8806 0.9365 0.9811 0.9957 0.8664 0.9474 0.9878 0.7361 0.9180
0.0375 12.1875 780 0.0533 0.8812 0.9237 0.9818 0.9918 0.8046 0.9748 0.9881 0.7330 0.9224
0.0364 12.5 800 0.0530 0.8842 0.9276 0.9822 0.9936 0.8217 0.9676 0.9884 0.7405 0.9236
0.031 12.8125 820 0.0542 0.8818 0.9268 0.9815 0.9954 0.8280 0.9571 0.9877 0.7371 0.9206
0.0322 13.125 840 0.0533 0.8841 0.9352 0.9820 0.9939 0.8506 0.9611 0.9886 0.7411 0.9226
0.0343 13.4375 860 0.0543 0.8817 0.9219 0.9820 0.9942 0.8044 0.9672 0.9883 0.7341 0.9225
0.0368 13.75 880 0.0520 0.8848 0.9308 0.9824 0.9942 0.8334 0.9647 0.9889 0.7410 0.9245
0.0297 14.0625 900 0.0535 0.8825 0.9256 0.9821 0.9923 0.8111 0.9735 0.9885 0.7355 0.9234
0.0606 14.375 920 0.0538 0.8800 0.9188 0.9819 0.9939 0.7926 0.9699 0.9885 0.7289 0.9225
0.0429 14.6875 940 0.0535 0.8802 0.9188 0.9823 0.9938 0.7902 0.9724 0.9889 0.7276 0.9241
0.0692 15.0 960 0.0565 0.8813 0.9278 0.9812 0.9898 0.8163 0.9772 0.9873 0.7367 0.9200
0.0359 15.3125 980 0.0535 0.8832 0.9261 0.9820 0.9954 0.8228 0.9600 0.9882 0.7390 0.9224
0.0282 15.625 1000 0.0529 0.8838 0.9240 0.9821 0.9958 0.8160 0.9603 0.9882 0.7399 0.9231
0.038 15.9375 1020 0.0535 0.8808 0.9217 0.9812 0.9946 0.8094 0.9612 0.9872 0.7364 0.9189
0.0355 16.25 1040 0.0536 0.8822 0.9222 0.9824 0.9946 0.8042 0.9677 0.9888 0.7333 0.9244
0.046 16.5625 1060 0.0540 0.8831 0.9248 0.9820 0.9919 0.8074 0.9752 0.9883 0.7378 0.9231
0.0346 16.875 1080 0.0514 0.8851 0.9283 0.9824 0.9937 0.8231 0.9680 0.9886 0.7420 0.9247
0.0355 17.1875 1100 0.0523 0.8844 0.9272 0.9823 0.9947 0.8226 0.9641 0.9886 0.7404 0.9241
0.0317 17.5 1120 0.0517 0.8834 0.9229 0.9826 0.9946 0.8055 0.9686 0.9890 0.7358 0.9253
0.0489 17.8125 1140 0.0526 0.8823 0.9213 0.9824 0.9939 0.7990 0.9711 0.9889 0.7333 0.9246
0.0318 18.125 1160 0.0520 0.8864 0.9314 0.9824 0.9951 0.8384 0.9607 0.9886 0.7464 0.9242
0.0264 18.4375 1180 0.0518 0.8853 0.9300 0.9823 0.9946 0.8329 0.9626 0.9885 0.7439 0.9235
0.036 18.75 1200 0.0524 0.8821 0.9200 0.9826 0.9947 0.7958 0.9696 0.9890 0.7320 0.9253
0.0288 19.0625 1220 0.0540 0.8794 0.9167 0.9821 0.9933 0.7818 0.9748 0.9888 0.7258 0.9235
0.0304 19.375 1240 0.0530 0.8833 0.9230 0.9821 0.9955 0.8111 0.9623 0.9883 0.7384 0.9230
0.0363 19.6875 1260 0.0530 0.8838 0.9237 0.9823 0.9951 0.8115 0.9644 0.9885 0.7390 0.9238
0.0371 20.0 1280 0.0518 0.8861 0.9279 0.9828 0.9940 0.8206 0.9692 0.9891 0.7434 0.9259
0.0253 20.3125 1300 0.0541 0.8829 0.9226 0.9824 0.9935 0.8023 0.9720 0.9888 0.7356 0.9245
0.0296 20.625 1320 0.0533 0.8861 0.9321 0.9824 0.9932 0.8351 0.9681 0.9887 0.7454 0.9243
0.0306 20.9375 1340 0.0521 0.8842 0.9254 0.9826 0.9936 0.8112 0.9713 0.9891 0.7381 0.9253
0.0341 21.25 1360 0.0530 0.8828 0.9217 0.9825 0.9939 0.8001 0.9712 0.9889 0.7347 0.9247
0.0215 21.5625 1380 0.0537 0.8840 0.9355 0.9817 0.9954 0.8581 0.9529 0.9881 0.7432 0.9206
0.033 21.875 1400 0.0517 0.8868 0.9319 0.9827 0.9944 0.8369 0.9645 0.9890 0.7462 0.9252
0.0284 22.1875 1420 0.0530 0.8840 0.9242 0.9825 0.9938 0.8083 0.9706 0.9889 0.7381 0.9249
0.0238 22.5 1440 0.0518 0.8864 0.9335 0.9826 0.9949 0.8443 0.9613 0.9890 0.7456 0.9247
0.0222 22.8125 1460 0.0541 0.8814 0.9211 0.9823 0.9924 0.7942 0.9766 0.9889 0.7314 0.9240
0.0263 23.125 1480 0.0528 0.8851 0.9273 0.9826 0.9941 0.8200 0.9677 0.9889 0.7414 0.9249
0.0246 23.4375 1500 0.0532 0.8858 0.9317 0.9825 0.9935 0.8343 0.9673 0.9889 0.7437 0.9247
0.0382 23.75 1520 0.0548 0.8835 0.9276 0.9819 0.9913 0.8164 0.9750 0.9881 0.7399 0.9223
0.02 24.0625 1540 0.0537 0.8845 0.9271 0.9824 0.9926 0.8158 0.9729 0.9887 0.7406 0.9242
0.0293 24.375 1560 0.0539 0.8854 0.9300 0.9824 0.9927 0.8261 0.9711 0.9887 0.7433 0.9242
0.0277 24.6875 1580 0.0533 0.8854 0.9303 0.9824 0.9929 0.8282 0.9698 0.9887 0.7434 0.9241
0.0225 25.0 1600 0.0534 0.8854 0.9368 0.9823 0.9937 0.8543 0.9625 0.9889 0.7438 0.9235
0.0349 25.3125 1620 0.0535 0.8851 0.9260 0.9827 0.9942 0.8153 0.9686 0.9890 0.7411 0.9252
0.0258 25.625 1640 0.0527 0.8853 0.9279 0.9826 0.9938 0.8212 0.9686 0.9889 0.7423 0.9248
0.033 25.9375 1660 0.0522 0.8860 0.9312 0.9826 0.9951 0.8368 0.9618 0.9889 0.7445 0.9247
0.0202 26.25 1680 0.0518 0.8866 0.9307 0.9828 0.9946 0.8325 0.9649 0.9891 0.7453 0.9255
0.0246 26.5625 1700 0.0530 0.8863 0.9369 0.9825 0.9936 0.8535 0.9637 0.9890 0.7457 0.9242
0.0211 26.875 1720 0.0531 0.8859 0.9289 0.9827 0.9938 0.8240 0.9690 0.9892 0.7429 0.9255
0.0417 27.1875 1740 0.0525 0.8862 0.9296 0.9828 0.9935 0.8254 0.9700 0.9891 0.7437 0.9257
0.0392 27.5 1760 0.0522 0.8868 0.9333 0.9828 0.9939 0.8397 0.9662 0.9892 0.7457 0.9256
0.0248 27.8125 1780 0.0531 0.8867 0.9329 0.9827 0.9943 0.8399 0.9645 0.9891 0.7461 0.9251
0.0255 28.125 1800 0.0540 0.8862 0.9329 0.9825 0.9934 0.8381 0.9673 0.9889 0.7449 0.9247
0.0233 28.4375 1820 0.0537 0.8858 0.9296 0.9826 0.9931 0.8251 0.9704 0.9889 0.7435 0.9248
0.0307 28.75 1840 0.0531 0.8865 0.9299 0.9827 0.9944 0.8291 0.9662 0.9891 0.7450 0.9254
0.0308 29.0625 1860 0.0536 0.8867 0.9329 0.9827 0.9939 0.8389 0.9660 0.9890 0.7459 0.9251
0.0259 29.375 1880 0.0540 0.8850 0.9262 0.9825 0.9945 0.8178 0.9664 0.9888 0.7416 0.9245
0.0295 29.6875 1900 0.0545 0.8838 0.9244 0.9824 0.9937 0.8093 0.9703 0.9888 0.7382 0.9243
0.0197 30.0 1920 0.0539 0.8853 0.9285 0.9825 0.9938 0.8235 0.9683 0.9889 0.7425 0.9247
0.0369 30.3125 1940 0.0539 0.8846 0.9269 0.9824 0.9942 0.8195 0.9668 0.9888 0.7407 0.9242
0.0262 30.625 1960 0.0543 0.8849 0.9287 0.9824 0.9936 0.8241 0.9683 0.9889 0.7415 0.9242
0.0295 30.9375 1980 0.0547 0.8845 0.9269 0.9825 0.9932 0.8162 0.9714 0.9889 0.7400 0.9246
0.0247 31.25 2000 0.0550 0.8855 0.9296 0.9824 0.9943 0.8296 0.9649 0.9887 0.7440 0.9239
0.0283 31.5625 2020 0.0552 0.8828 0.9222 0.9823 0.9939 0.8023 0.9705 0.9888 0.7358 0.9240
0.0333 31.875 2040 0.0543 0.8857 0.9303 0.9825 0.9940 0.8308 0.9660 0.9888 0.7439 0.9244
0.0256 32.1875 2060 0.0540 0.8860 0.9365 0.9824 0.9941 0.8535 0.9617 0.9890 0.7450 0.9239
0.0237 32.5 2080 0.0539 0.8846 0.9241 0.9827 0.9943 0.8083 0.9697 0.9891 0.7390 0.9256
0.0236 32.8125 2100 0.0537 0.8855 0.9276 0.9827 0.9937 0.8187 0.9703 0.9891 0.7417 0.9256
0.0238 33.125 2120 0.0539 0.8849 0.9265 0.9825 0.9947 0.8191 0.9659 0.9889 0.7409 0.9248
0.0265 33.4375 2140 0.0543 0.8858 0.9316 0.9825 0.9938 0.8344 0.9664 0.9889 0.7438 0.9246
0.0274 33.75 2160 0.0555 0.8826 0.9225 0.9824 0.9939 0.8029 0.9706 0.9890 0.7344 0.9245
0.0232 34.0625 2180 0.0543 0.8857 0.9316 0.9826 0.9935 0.8336 0.9677 0.9890 0.7434 0.9248
0.0276 34.375 2200 0.0547 0.8838 0.9240 0.9826 0.9941 0.8082 0.9697 0.9891 0.7373 0.9251
0.033 34.6875 2220 0.0538 0.8851 0.9267 0.9826 0.9948 0.8198 0.9657 0.9890 0.7413 0.9251
0.0333 35.0 2240 0.0540 0.8857 0.9291 0.9827 0.9937 0.8247 0.9690 0.9891 0.7426 0.9254
0.0221 35.3125 2260 0.0545 0.8856 0.9291 0.9826 0.9941 0.8260 0.9674 0.9891 0.7426 0.9251
0.0286 35.625 2280 0.0549 0.8852 0.9292 0.9824 0.9940 0.8275 0.9661 0.9887 0.7428 0.9240
0.0231 35.9375 2300 0.0545 0.8855 0.9288 0.9826 0.9941 0.8251 0.9673 0.9890 0.7425 0.9250
0.0301 36.25 2320 0.0544 0.8853 0.9284 0.9825 0.9946 0.8258 0.9650 0.9888 0.7425 0.9245
0.0311 36.5625 2340 0.0545 0.8853 0.9289 0.9826 0.9937 0.8245 0.9685 0.9889 0.7422 0.9248
0.0231 36.875 2360 0.0548 0.8854 0.9284 0.9825 0.9945 0.8257 0.9650 0.9888 0.7430 0.9243
0.0187 37.1875 2380 0.0548 0.8859 0.9313 0.9826 0.9941 0.8342 0.9656 0.9890 0.7441 0.9247
0.0355 37.5 2400 0.0550 0.8846 0.9261 0.9825 0.9945 0.8173 0.9665 0.9889 0.7405 0.9244
0.021 37.8125 2420 0.0547 0.8857 0.9300 0.9825 0.9940 0.8295 0.9664 0.9889 0.7436 0.9246
0.0274 38.125 2440 0.0545 0.8854 0.9285 0.9826 0.9940 0.8240 0.9676 0.9890 0.7423 0.9249
0.0288 38.4375 2460 0.0545 0.8849 0.9270 0.9826 0.9941 0.8188 0.9682 0.9890 0.7408 0.9250
0.0315 38.75 2480 0.0548 0.8847 0.9260 0.9826 0.9942 0.8158 0.9681 0.9890 0.7404 0.9248
0.0221 39.0625 2500 0.0550 0.8858 0.9295 0.9826 0.9941 0.8276 0.9668 0.9890 0.7435 0.9248
0.021 39.375 2520 0.0552 0.8855 0.9290 0.9826 0.9940 0.8255 0.9674 0.9889 0.7429 0.9248
0.0261 39.6875 2540 0.0544 0.8852 0.9274 0.9826 0.9942 0.8208 0.9673 0.9889 0.7419 0.9248
0.0152 40.0 2560 0.0547 0.8851 0.9274 0.9826 0.9941 0.8203 0.9678 0.9889 0.7417 0.9247

Framework versions

  • Transformers 4.49.0
  • Pytorch 2.6.0+cu124
  • Datasets 3.3.2
  • Tokenizers 0.21.0
Model size

  • 27.4M params (F32, Safetensors)
Model tree for mujerry/segformer-b2-finetuned-ade-512-512_necrosis

  • Fine-tuned from nvidia/segformer-b2-finetuned-ade-512-512