# tf_segformer_for_semantic_segmentation_v3
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset. It achieves the following results on the evaluation set:
- Train Loss: 0.2414
- Validation Loss: 0.2879
- Validation Mean Iou: 0.6578
- Validation Mean Accuracy: 0.8810
- Validation Overall Accuracy: 0.9074
- Validation Accuracy Unlabeled: nan
- Validation Accuracy Building: 0.9097
- Validation Accuracy Land: 0.9396
- Validation Accuracy Road: 0.8000
- Validation Accuracy Vegetation: 0.8165
- Validation Accuracy Water: 0.9392
- Validation Iou Unlabeled: 0.0
- Validation Iou Building: 0.7772
- Validation Iou Land: 0.9028
- Validation Iou Road: 0.6828
- Validation Iou Vegetation: 0.6714
- Validation Iou Water: 0.9125
- Epoch: 49
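The aggregate metrics above can be reproduced from the per-class values, and doing so highlights an asymmetry in how the Unlabeled class is handled: its accuracy is `nan` and is excluded from the mean accuracy, while its IoU is 0.0 and *is* included in the mean IoU, pulling that figure down. A quick check in plain Python (values copied from the list above):

```python
import math

# Per-class values from the final epoch, in the order
# Unlabeled, Building, Land, Road, Vegetation, Water.
accuracies = [math.nan, 0.9097, 0.9396, 0.8000, 0.8165, 0.9392]
ious = [0.0, 0.7772, 0.9028, 0.6828, 0.6714, 0.9125]

# Mean accuracy ignores nan entries (the Unlabeled class).
valid_acc = [a for a in accuracies if not math.isnan(a)]
mean_accuracy = sum(valid_acc) / len(valid_acc)

# Mean IoU averages over all six classes, including Unlabeled's 0.0.
mean_iou = sum(ious) / len(ious)

print(round(mean_accuracy, 4))  # 0.881
print(round(mean_iou, 4))       # 0.6578
```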
## Model description
More information needed
## Intended uses & limitations
More information needed
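For context on how the per-pixel predictions behind the metrics above are obtained: SegFormer-style models emit class logits at 1/4 of the input resolution, and the segmentation map is the argmax over the label axis (followed by upsampling to the input size, omitted here). The sketch below is a minimal NumPy illustration under those assumptions; the `ID2LABEL` mapping is inferred from the metric names in this card and is not confirmed by the checkpoint.

```python
import numpy as np

# Assumed label mapping, taken from the per-class metric names above.
ID2LABEL = {0: "unlabeled", 1: "building", 2: "land",
            3: "road", 4: "vegetation", 5: "water"}

def logits_to_segmentation(logits: np.ndarray) -> np.ndarray:
    """Collapse (batch, num_labels, H, W) logits into an integer label map."""
    return logits.argmax(axis=1)  # (batch, H, W)

# Toy input: one image, 6 classes, a 4x4 logit grid.
rng = np.random.default_rng(0)
logits = rng.standard_normal((1, 6, 4, 4))
seg = logits_to_segmentation(logits)
assert seg.shape == (1, 4, 4)
assert seg.max() < len(ID2LABEL)
```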
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': False, 'is_legacy_optimizer': False, 'learning_rate': 6e-05, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
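The optimizer dictionary above corresponds to a stock Keras Adam configuration with a reduced learning rate and no weight decay or gradient clipping. A minimal sketch of re-creating it (assuming TensorFlow 2.11, as listed under framework versions below):

```python
import tensorflow as tf

# Re-create the Adam configuration from the hyperparameter dump above.
# Omitted keys (weight_decay, clipnorm, use_ema, ...) are left at their defaults,
# matching the None/False values recorded in the dict.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=6e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
```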
### Training results
Train Loss | Validation Loss | Validation Mean Iou | Validation Mean Accuracy | Validation Overall Accuracy | Validation Accuracy Unlabeled | Validation Accuracy Building | Validation Accuracy Land | Validation Accuracy Road | Validation Accuracy Vegetation | Validation Accuracy Water | Validation Iou Unlabeled | Validation Iou Building | Validation Iou Land | Validation Iou Road | Validation Iou Vegetation | Validation Iou Water | Epoch |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.4526 | 0.8367 | 0.4904 | 0.7407 | 0.7665 | nan | 0.8805 | 0.8041 | 0.4586 | 0.6741 | 0.8860 | 0.0 | 0.5858 | 0.7562 | 0.3475 | 0.4603 | 0.7926 | 0 |
0.9142 | 0.8189 | 0.4817 | 0.7306 | 0.7891 | nan | 0.8061 | 0.8595 | 0.5010 | 0.7384 | 0.7477 | 0.0 | 0.6034 | 0.8033 | 0.3891 | 0.4754 | 0.6193 | 1 |
0.8439 | 0.6237 | 0.5236 | 0.7804 | 0.8064 | nan | 0.8191 | 0.8573 | 0.6912 | 0.6527 | 0.8817 | 0.0 | 0.6321 | 0.8009 | 0.4659 | 0.5053 | 0.7374 | 2 |
0.7108 | 0.4985 | 0.5541 | 0.7800 | 0.8242 | nan | 0.8262 | 0.9105 | 0.6019 | 0.6743 | 0.8872 | 0.0 | 0.6613 | 0.7989 | 0.4979 | 0.5621 | 0.8042 | 3 |
0.6197 | 0.5273 | 0.5573 | 0.7976 | 0.8330 | nan | 0.8263 | 0.9030 | 0.7527 | 0.5885 | 0.9176 | 0.0 | 0.6898 | 0.8121 | 0.5267 | 0.5201 | 0.7950 | 4 |
0.6202 | 0.4948 | 0.5699 | 0.8110 | 0.8453 | nan | 0.8221 | 0.8921 | 0.7786 | 0.6705 | 0.8913 | 0.0 | 0.6880 | 0.8297 | 0.5585 | 0.5145 | 0.8285 | 5 |
0.5454 | 0.4675 | 0.5851 | 0.8227 | 0.8442 | nan | 0.8582 | 0.8823 | 0.8103 | 0.6061 | 0.9564 | 0.0 | 0.6922 | 0.8086 | 0.5854 | 0.5328 | 0.8918 | 6 |
0.5566 | 0.4371 | 0.5819 | 0.8162 | 0.8512 | nan | 0.8316 | 0.9073 | 0.7572 | 0.6785 | 0.9065 | 0.0 | 0.6899 | 0.8392 | 0.5751 | 0.5413 | 0.8461 | 7 |
0.5456 | 0.4071 | 0.6949 | 0.8177 | 0.8394 | nan | 0.8253 | 0.8993 | 0.7893 | 0.6329 | 0.9419 | nan | 0.6964 | 0.8189 | 0.5692 | 0.5349 | 0.8549 | 8 |
0.4690 | 0.4223 | 0.7227 | 0.8386 | 0.8695 | nan | 0.8670 | 0.9101 | 0.7523 | 0.7607 | 0.9030 | nan | 0.6947 | 0.8547 | 0.5985 | 0.6205 | 0.8450 | 9 |
0.4529 | 0.4232 | 0.7150 | 0.8162 | 0.8821 | nan | 0.8292 | 0.9446 | 0.7629 | 0.5798 | 0.9642 | nan | 0.7179 | 0.8763 | 0.5976 | 0.5008 | 0.8824 | 10 |
0.4981 | 0.4245 | 0.7228 | 0.8380 | 0.8533 | nan | 0.8670 | 0.8952 | 0.7708 | 0.7406 | 0.9164 | nan | 0.7185 | 0.8274 | 0.6025 | 0.6131 | 0.8525 | 11 |
0.4228 | 0.4220 | 0.7169 | 0.8353 | 0.8518 | nan | 0.8611 | 0.8920 | 0.7923 | 0.6990 | 0.9322 | nan | 0.7122 | 0.8267 | 0.5888 | 0.5889 | 0.8679 | 12 |
0.4255 | 0.3802 | 0.6233 | 0.8495 | 0.8762 | nan | 0.8466 | 0.9237 | 0.7867 | 0.7465 | 0.9441 | 0.0 | 0.7233 | 0.8528 | 0.6567 | 0.6183 | 0.8886 | 13 |
0.4457 | 0.4365 | 0.7254 | 0.8327 | 0.8693 | nan | 0.8500 | 0.9243 | 0.8040 | 0.6128 | 0.9722 | nan | 0.7290 | 0.8538 | 0.6080 | 0.5445 | 0.8917 | 14 |
0.4173 | 0.3841 | 0.6104 | 0.8470 | 0.8702 | nan | 0.8736 | 0.9048 | 0.7940 | 0.7135 | 0.9493 | 0.0 | 0.7304 | 0.8491 | 0.6228 | 0.5789 | 0.8812 | 15 |
0.3853 | 0.3850 | 0.7315 | 0.8385 | 0.8794 | nan | 0.8490 | 0.9274 | 0.8012 | 0.6797 | 0.9349 | nan | 0.7278 | 0.8684 | 0.6104 | 0.5668 | 0.8840 | 16 |
0.3878 | 0.3584 | 0.6125 | 0.8333 | 0.8776 | nan | 0.8501 | 0.9372 | 0.7605 | 0.6868 | 0.9320 | 0.0 | 0.7171 | 0.8644 | 0.6342 | 0.5848 | 0.8748 | 17 |
0.3712 | 0.3656 | 0.7346 | 0.8374 | 0.8701 | nan | 0.8502 | 0.9288 | 0.8080 | 0.6459 | 0.9539 | nan | 0.7293 | 0.8599 | 0.6076 | 0.5570 | 0.9193 | 18 |
0.3599 | 0.3579 | 0.6272 | 0.8596 | 0.8796 | nan | 0.8910 | 0.9121 | 0.8196 | 0.7660 | 0.9093 | 0.0 | 0.7358 | 0.8709 | 0.6404 | 0.6403 | 0.8758 | 19 |
0.3553 | 0.3772 | 0.7371 | 0.8366 | 0.8702 | nan | 0.8727 | 0.9261 | 0.7609 | 0.7066 | 0.9165 | nan | 0.7314 | 0.8508 | 0.6217 | 0.6103 | 0.8715 | 20 |
0.3575 | 0.3076 | 0.7509 | 0.8477 | 0.8782 | nan | 0.8486 | 0.9322 | 0.7724 | 0.7382 | 0.9472 | nan | 0.7268 | 0.8575 | 0.6197 | 0.6427 | 0.9080 | 21 |
0.3574 | 0.3587 | 0.6171 | 0.8376 | 0.8730 | nan | 0.8116 | 0.9386 | 0.8013 | 0.7115 | 0.9252 | 0.0 | 0.7260 | 0.8539 | 0.6531 | 0.5915 | 0.8783 | 22 |
0.3441 | 0.3546 | 0.6244 | 0.8566 | 0.8773 | nan | 0.8605 | 0.9106 | 0.8081 | 0.7666 | 0.9372 | 0.0 | 0.7214 | 0.8583 | 0.6324 | 0.6287 | 0.9054 | 23 |
0.3272 | 0.3879 | 0.6356 | 0.8645 | 0.8830 | nan | 0.8536 | 0.9187 | 0.7823 | 0.8268 | 0.9412 | 0.0 | 0.7377 | 0.8671 | 0.6367 | 0.6561 | 0.9160 | 24 |
0.3349 | 0.3440 | 0.6224 | 0.8474 | 0.8776 | nan | 0.8202 | 0.9365 | 0.7851 | 0.7852 | 0.9101 | 0.0 | 0.7372 | 0.8732 | 0.6418 | 0.6020 | 0.8802 | 25 |
0.3297 | 0.3825 | 0.6282 | 0.8582 | 0.8791 | nan | 0.8002 | 0.9170 | 0.7873 | 0.8529 | 0.9334 | 0.0 | 0.7102 | 0.8569 | 0.6205 | 0.6699 | 0.9117 | 26 |
0.3612 | 0.3368 | 0.6213 | 0.8429 | 0.8749 | nan | 0.8787 | 0.9287 | 0.7290 | 0.7225 | 0.9557 | 0.0 | 0.7496 | 0.8513 | 0.6246 | 0.6033 | 0.8989 | 27 |
0.3062 | 0.3057 | 0.6322 | 0.8566 | 0.8824 | nan | 0.9034 | 0.9204 | 0.7822 | 0.7184 | 0.9585 | 0.0 | 0.7574 | 0.8629 | 0.6293 | 0.6130 | 0.9304 | 28 |
0.3014 | 0.3295 | 0.6323 | 0.8528 | 0.8867 | nan | 0.8591 | 0.9423 | 0.7520 | 0.7583 | 0.9520 | 0.0 | 0.7497 | 0.8783 | 0.6324 | 0.6216 | 0.9117 | 29 |
0.3040 | 0.2963 | 0.6321 | 0.8604 | 0.8738 | nan | 0.8959 | 0.9064 | 0.7845 | 0.7703 | 0.9451 | 0.0 | 0.7565 | 0.8413 | 0.6466 | 0.6575 | 0.8904 | 30 |
0.2993 | 0.3428 | 0.6398 | 0.8676 | 0.8839 | nan | 0.8966 | 0.9122 | 0.8000 | 0.8014 | 0.9280 | 0.0 | 0.7597 | 0.8636 | 0.6597 | 0.6551 | 0.9005 | 31 |
0.3216 | 0.3200 | 0.6399 | 0.8620 | 0.8920 | nan | 0.8787 | 0.9374 | 0.7806 | 0.7470 | 0.9660 | 0.0 | 0.7557 | 0.8794 | 0.6513 | 0.6330 | 0.9199 | 32 |
0.3160 | 0.3261 | 0.6219 | 0.8580 | 0.8768 | nan | 0.8980 | 0.9022 | 0.8299 | 0.7000 | 0.9600 | 0.0 | 0.7370 | 0.8564 | 0.6536 | 0.5784 | 0.9062 | 33 |
0.3045 | 0.3550 | 0.6462 | 0.8639 | 0.9011 | nan | 0.8732 | 0.9450 | 0.8075 | 0.7252 | 0.9686 | 0.0 | 0.7674 | 0.8872 | 0.6693 | 0.6262 | 0.9273 | 34 |
0.2839 | 0.3102 | 0.6421 | 0.8618 | 0.8895 | nan | 0.9091 | 0.9314 | 0.7588 | 0.7451 | 0.9646 | 0.0 | 0.7695 | 0.8729 | 0.6418 | 0.6417 | 0.9264 | 35 |
0.3003 | 0.2942 | 0.6253 | 0.8502 | 0.8824 | nan | 0.9090 | 0.9207 | 0.8180 | 0.6335 | 0.9701 | 0.0 | 0.7566 | 0.8691 | 0.6745 | 0.5433 | 0.9081 | 36 |
0.2726 | 0.3068 | 0.6369 | 0.8642 | 0.8880 | nan | 0.9063 | 0.9213 | 0.8244 | 0.7194 | 0.9498 | 0.0 | 0.7534 | 0.8755 | 0.6625 | 0.6225 | 0.9077 | 37 |
0.2700 | 0.3095 | 0.6421 | 0.8645 | 0.8818 | nan | 0.8651 | 0.9209 | 0.7934 | 0.7898 | 0.9532 | 0.0 | 0.7606 | 0.8526 | 0.6613 | 0.6499 | 0.9279 | 38 |
0.2835 | 0.2986 | 0.6391 | 0.8598 | 0.8906 | nan | 0.8870 | 0.9343 | 0.7945 | 0.7351 | 0.9481 | 0.0 | 0.7588 | 0.8776 | 0.6573 | 0.6247 | 0.9160 | 39 |
0.2746 | 0.2812 | 0.6401 | 0.8652 | 0.8895 | nan | 0.8654 | 0.9247 | 0.7898 | 0.7820 | 0.9642 | 0.0 | 0.7451 | 0.8659 | 0.6704 | 0.6198 | 0.9394 | 40 |
0.2866 | 0.3194 | 0.6506 | 0.8679 | 0.8943 | nan | 0.8787 | 0.9420 | 0.8137 | 0.7722 | 0.9330 | 0.0 | 0.7775 | 0.8787 | 0.6729 | 0.6738 | 0.9005 | 41 |
0.2641 | 0.3372 | 0.6415 | 0.8745 | 0.8794 | nan | 0.8758 | 0.9017 | 0.7996 | 0.8507 | 0.9446 | 0.0 | 0.7665 | 0.8619 | 0.6275 | 0.6932 | 0.9001 | 42 |
0.2396 | 0.3112 | 0.6466 | 0.8604 | 0.8889 | nan | 0.8734 | 0.9432 | 0.7511 | 0.7938 | 0.9403 | 0.0 | 0.7722 | 0.8680 | 0.6424 | 0.6795 | 0.9173 | 43 |
0.2566 | 0.3336 | 0.6475 | 0.8762 | 0.8850 | nan | 0.8915 | 0.9044 | 0.7995 | 0.8637 | 0.9222 | 0.0 | 0.7743 | 0.8640 | 0.6572 | 0.6878 | 0.9016 | 44 |
0.2612 | 0.3152 | 0.6522 | 0.8776 | 0.8978 | nan | 0.8580 | 0.9306 | 0.8161 | 0.8183 | 0.9651 | 0.0 | 0.7516 | 0.8810 | 0.6798 | 0.6604 | 0.9404 | 45 |
0.2442 | 0.2982 | 0.6471 | 0.8749 | 0.8923 | nan | 0.8858 | 0.9214 | 0.8028 | 0.8284 | 0.9363 | 0.0 | 0.7642 | 0.8778 | 0.6724 | 0.6602 | 0.9078 | 46 |
0.2418 | 0.3066 | 0.6460 | 0.8689 | 0.8871 | nan | 0.8999 | 0.9207 | 0.7802 | 0.8186 | 0.9248 | 0.0 | 0.7754 | 0.8710 | 0.6689 | 0.6587 | 0.9020 | 47 |
0.2410 | 0.2876 | 0.6493 | 0.8697 | 0.8940 | nan | 0.8915 | 0.9324 | 0.7913 | 0.7950 | 0.9382 | 0.0 | 0.7792 | 0.8757 | 0.6665 | 0.6756 | 0.8986 | 48 |
0.2414 | 0.2879 | 0.6578 | 0.8810 | 0.9074 | nan | 0.9097 | 0.9396 | 0.8000 | 0.8165 | 0.9392 | 0.0 | 0.7772 | 0.9028 | 0.6828 | 0.6714 | 0.9125 | 49 |
### Framework versions
- Transformers 4.37.2
- TensorFlow 2.11.0
- Datasets 3.3.1
- Tokenizers 0.15.2