# tf_segformer_for_semantic_segmentation_v4

This model is a fine-tuned version of ArtemShyshko/tf_segformer_for_semantic_segmentation_v3 on an unknown dataset. It achieves the following results on the evaluation set:
- Train Loss: 0.1650
- Validation Loss: 0.2153
- Validation Mean Iou: 0.6876
- Validation Mean Accuracy: 0.9048
- Validation Overall Accuracy: 0.9199
- Validation Accuracy Unlabeled: nan
- Validation Accuracy Building: 0.9185
- Validation Accuracy Land: 0.9449
- Validation Accuracy Road: 0.8327
- Validation Accuracy Vegetation: 0.8659
- Validation Accuracy Water: 0.9617
- Validation Iou Unlabeled: 0.0
- Validation Iou Building: 0.8219
- Validation Iou Land: 0.9131
- Validation Iou Road: 0.7215
- Validation Iou Vegetation: 0.7242
- Validation Iou Water: 0.9446
- Epoch: 49
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': False, 'is_legacy_optimizer': False, 'learning_rate': 6e-05, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
### Training results
Train Loss | Validation Loss | Validation Mean Iou | Validation Mean Accuracy | Validation Overall Accuracy | Validation Accuracy Unlabeled | Validation Accuracy Building | Validation Accuracy Land | Validation Accuracy Road | Validation Accuracy Vegetation | Validation Accuracy Water | Validation Iou Unlabeled | Validation Iou Building | Validation Iou Land | Validation Iou Road | Validation Iou Vegetation | Validation Iou Water | Epoch |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.2467 | 0.2755 | 0.6468 | 0.8761 | 0.8850 | nan | 0.9232 | 0.9014 | 0.8291 | 0.7739 | 0.9529 | 0.0 | 0.7716 | 0.8658 | 0.6733 | 0.6420 | 0.9280 | 0 |
0.2367 | 0.2769 | 0.6553 | 0.8753 | 0.8970 | nan | 0.9238 | 0.9310 | 0.7846 | 0.7702 | 0.9671 | 0.0 | 0.7730 | 0.8842 | 0.6751 | 0.6636 | 0.9356 | 1 |
0.2406 | 0.3519 | 0.6367 | 0.8676 | 0.8789 | nan | 0.8361 | 0.9081 | 0.7676 | 0.9095 | 0.9169 | 0.0 | 0.7527 | 0.8729 | 0.6552 | 0.6379 | 0.9013 | 2 |
0.2348 | 0.2933 | 0.6570 | 0.8832 | 0.9020 | nan | 0.8996 | 0.9290 | 0.8104 | 0.8389 | 0.9380 | 0.0 | 0.7799 | 0.8954 | 0.6720 | 0.6820 | 0.9125 | 3 |
0.2336 | 0.2938 | 0.6564 | 0.8851 | 0.8900 | nan | 0.9009 | 0.9052 | 0.8029 | 0.8619 | 0.9546 | 0.0 | 0.7787 | 0.8662 | 0.6791 | 0.6865 | 0.9279 | 4 |
0.2376 | 0.2802 | 0.6517 | 0.8767 | 0.8889 | nan | 0.8726 | 0.9189 | 0.7901 | 0.8472 | 0.9548 | 0.0 | 0.7803 | 0.8701 | 0.6728 | 0.6518 | 0.9351 | 5 |
0.2367 | 0.2769 | 0.6584 | 0.8787 | 0.9003 | nan | 0.9103 | 0.9317 | 0.8161 | 0.8120 | 0.9234 | 0.0 | 0.7769 | 0.8935 | 0.6936 | 0.6841 | 0.9023 | 6 |
0.2507 | 0.2647 | 0.6606 | 0.8887 | 0.8902 | nan | 0.9096 | 0.8991 | 0.8177 | 0.8655 | 0.9517 | 0.0 | 0.7848 | 0.8550 | 0.6809 | 0.7212 | 0.9216 | 7 |
0.2220 | 0.2647 | 0.6520 | 0.8755 | 0.8888 | nan | 0.9019 | 0.9207 | 0.8034 | 0.8054 | 0.9459 | 0.0 | 0.7771 | 0.8728 | 0.6830 | 0.6600 | 0.9194 | 8 |
0.2384 | 0.2602 | 0.6627 | 0.8821 | 0.8941 | nan | 0.9051 | 0.9254 | 0.8178 | 0.7913 | 0.9711 | 0.0 | 0.7900 | 0.8708 | 0.6862 | 0.6855 | 0.9437 | 9 |
0.2066 | 0.2682 | 0.6584 | 0.8797 | 0.8947 | nan | 0.8881 | 0.9245 | 0.7584 | 0.8780 | 0.9495 | 0.0 | 0.7857 | 0.8756 | 0.6560 | 0.7086 | 0.9246 | 10 |
0.2247 | 0.2975 | 0.6574 | 0.8844 | 0.8961 | nan | 0.8884 | 0.9223 | 0.7936 | 0.8745 | 0.9432 | 0.0 | 0.7915 | 0.8862 | 0.6822 | 0.6724 | 0.9123 | 11 |
0.2261 | 0.2969 | 0.6598 | 0.8815 | 0.9060 | nan | 0.9053 | 0.9372 | 0.7996 | 0.8116 | 0.9539 | 0.0 | 0.7777 | 0.8957 | 0.6733 | 0.6864 | 0.9259 | 12 |
0.2089 | 0.3076 | 0.6623 | 0.8983 | 0.9017 | nan | 0.9136 | 0.9111 | 0.8503 | 0.8682 | 0.9484 | 0.0 | 0.7756 | 0.8870 | 0.6867 | 0.7028 | 0.9220 | 13 |
0.2105 | 0.2626 | 0.6672 | 0.8844 | 0.9068 | nan | 0.8999 | 0.9426 | 0.7893 | 0.8336 | 0.9565 | 0.0 | 0.7941 | 0.8967 | 0.6793 | 0.6950 | 0.9383 | 14 |
0.2076 | 0.2772 | 0.6623 | 0.8839 | 0.8983 | nan | 0.9070 | 0.9260 | 0.7668 | 0.8572 | 0.9624 | 0.0 | 0.7831 | 0.8851 | 0.6612 | 0.7057 | 0.9388 | 15 |
0.2083 | 0.2664 | 0.6675 | 0.8814 | 0.9068 | nan | 0.8879 | 0.9491 | 0.8269 | 0.7909 | 0.9520 | 0.0 | 0.7876 | 0.8973 | 0.7071 | 0.6836 | 0.9295 | 16 |
0.2248 | 0.2665 | 0.6701 | 0.8895 | 0.9022 | nan | 0.9080 | 0.9308 | 0.8070 | 0.8489 | 0.9527 | 0.0 | 0.7985 | 0.8846 | 0.6915 | 0.7227 | 0.9233 | 17 |
0.2047 | 0.2568 | 0.6634 | 0.8846 | 0.9119 | nan | 0.9280 | 0.9419 | 0.7913 | 0.8303 | 0.9316 | 0.0 | 0.7889 | 0.9093 | 0.6815 | 0.6865 | 0.9142 | 18 |
0.2068 | 0.2512 | 0.6711 | 0.8978 | 0.8992 | nan | 0.9060 | 0.9058 | 0.8462 | 0.8982 | 0.9330 | 0.0 | 0.7949 | 0.8736 | 0.6881 | 0.7596 | 0.9101 | 19 |
0.1991 | 0.2676 | 0.6721 | 0.8920 | 0.9029 | nan | 0.9295 | 0.9232 | 0.7967 | 0.8581 | 0.9524 | 0.0 | 0.7934 | 0.8871 | 0.6849 | 0.7333 | 0.9341 | 20 |
0.2191 | 0.2659 | 0.6677 | 0.8894 | 0.9005 | nan | 0.9404 | 0.9180 | 0.8045 | 0.8251 | 0.9591 | 0.0 | 0.8064 | 0.8839 | 0.6820 | 0.6952 | 0.9384 | 21 |
0.1970 | 0.2853 | 0.6735 | 0.8995 | 0.9099 | nan | 0.9171 | 0.9268 | 0.8364 | 0.8649 | 0.9520 | 0.0 | 0.8005 | 0.9006 | 0.7030 | 0.7028 | 0.9338 | 22 |
0.1915 | 0.2762 | 0.6668 | 0.8788 | 0.9035 | nan | 0.9152 | 0.9446 | 0.8138 | 0.7688 | 0.9519 | 0.0 | 0.7978 | 0.8899 | 0.7077 | 0.6770 | 0.9286 | 23 |
0.2052 | 0.2811 | 0.6643 | 0.8813 | 0.9016 | nan | 0.8935 | 0.9396 | 0.7971 | 0.8198 | 0.9563 | 0.0 | 0.7984 | 0.8828 | 0.6914 | 0.6915 | 0.9218 | 24 |
0.1869 | 0.2700 | 0.6673 | 0.8848 | 0.9033 | nan | 0.9119 | 0.9345 | 0.7530 | 0.8789 | 0.9458 | 0.0 | 0.7954 | 0.8913 | 0.6736 | 0.7233 | 0.9201 | 25 |
0.1792 | 0.2457 | 0.6718 | 0.8948 | 0.9091 | nan | 0.9382 | 0.9292 | 0.8370 | 0.8277 | 0.9422 | 0.0 | 0.7998 | 0.9052 | 0.6937 | 0.7086 | 0.9236 | 26 |
0.1800 | 0.2548 | 0.6763 | 0.8980 | 0.9066 | nan | 0.9288 | 0.9219 | 0.8142 | 0.8690 | 0.9562 | 0.0 | 0.8098 | 0.8899 | 0.6983 | 0.7221 | 0.9378 | 27 |
0.1817 | 0.2376 | 0.6722 | 0.8916 | 0.9119 | nan | 0.9263 | 0.9397 | 0.8103 | 0.8333 | 0.9483 | 0.0 | 0.8046 | 0.9095 | 0.6872 | 0.7042 | 0.9275 | 28 |
0.1819 | 0.2529 | 0.6830 | 0.9084 | 0.9146 | nan | 0.9167 | 0.9243 | 0.8381 | 0.8899 | 0.9731 | 0.0 | 0.8121 | 0.9013 | 0.6987 | 0.7376 | 0.9484 | 29 |
0.1886 | 0.2466 | 0.6713 | 0.8867 | 0.9029 | nan | 0.9136 | 0.9348 | 0.7895 | 0.8383 | 0.9571 | 0.0 | 0.7964 | 0.8824 | 0.6828 | 0.7288 | 0.9374 | 30 |
0.2018 | 0.2568 | 0.6678 | 0.8849 | 0.9069 | nan | 0.9198 | 0.9402 | 0.8335 | 0.7745 | 0.9564 | 0.0 | 0.7950 | 0.8920 | 0.7093 | 0.6994 | 0.9114 | 31 |
0.1954 | 0.2296 | 0.6734 | 0.8940 | 0.9163 | nan | 0.8895 | 0.9466 | 0.8254 | 0.8546 | 0.9538 | 0.0 | 0.7993 | 0.9096 | 0.6943 | 0.7081 | 0.9292 | 32 |
0.1849 | 0.2215 | 0.6684 | 0.8923 | 0.9023 | nan | 0.9052 | 0.9234 | 0.7964 | 0.8880 | 0.9486 | 0.0 | 0.8043 | 0.8871 | 0.6884 | 0.7043 | 0.9263 | 33 |
0.1941 | 0.2405 | 0.6742 | 0.8964 | 0.9164 | nan | 0.9040 | 0.9440 | 0.8431 | 0.8437 | 0.9471 | 0.0 | 0.7987 | 0.9139 | 0.7265 | 0.6853 | 0.9210 | 34 |
0.1881 | 0.2605 | 0.6642 | 0.8888 | 0.9023 | nan | 0.8699 | 0.9318 | 0.8009 | 0.8862 | 0.9553 | 0.0 | 0.7893 | 0.8879 | 0.6939 | 0.6788 | 0.9353 | 35 |
0.1943 | 0.2699 | 0.6777 | 0.8968 | 0.9079 | nan | 0.9107 | 0.9318 | 0.8230 | 0.8593 | 0.9592 | 0.0 | 0.8178 | 0.8914 | 0.7058 | 0.7155 | 0.9358 | 36 |
0.1872 | 0.2382 | 0.6859 | 0.9060 | 0.9130 | nan | 0.9350 | 0.9265 | 0.8462 | 0.8647 | 0.9575 | 0.0 | 0.8186 | 0.8951 | 0.7148 | 0.7501 | 0.9367 | 37 |
0.1887 | 0.2526 | 0.6795 | 0.8990 | 0.9080 | nan | 0.9147 | 0.9278 | 0.8445 | 0.8660 | 0.9418 | 0.0 | 0.8114 | 0.8844 | 0.7228 | 0.7462 | 0.9125 | 38 |
0.1865 | 0.2354 | 0.6789 | 0.8966 | 0.9103 | nan | 0.9246 | 0.9360 | 0.8430 | 0.8216 | 0.9576 | 0.0 | 0.8185 | 0.8953 | 0.7082 | 0.7200 | 0.9316 | 39 |
0.1828 | 0.2307 | 0.6804 | 0.8986 | 0.9101 | nan | 0.8899 | 0.9370 | 0.8666 | 0.8531 | 0.9464 | 0.0 | 0.8115 | 0.8902 | 0.7156 | 0.7484 | 0.9166 | 40 |
0.1804 | 0.2335 | 0.6764 | 0.8951 | 0.9114 | nan | 0.9012 | 0.9405 | 0.8371 | 0.8464 | 0.9507 | 0.0 | 0.8047 | 0.8982 | 0.7301 | 0.7010 | 0.9245 | 41 |
0.1795 | 0.2624 | 0.6826 | 0.8998 | 0.9151 | nan | 0.9294 | 0.9372 | 0.8116 | 0.8699 | 0.9510 | 0.0 | 0.8257 | 0.9003 | 0.7014 | 0.7381 | 0.9304 | 42 |
0.1777 | 0.2251 | 0.6863 | 0.9010 | 0.9164 | nan | 0.9130 | 0.9427 | 0.8211 | 0.8754 | 0.9528 | 0.0 | 0.8200 | 0.9004 | 0.7167 | 0.7506 | 0.9301 | 43 |
0.1750 | 0.2294 | 0.6769 | 0.8958 | 0.9019 | nan | 0.9304 | 0.9174 | 0.8202 | 0.8608 | 0.9502 | 0.0 | 0.8213 | 0.8742 | 0.7147 | 0.7251 | 0.9258 | 44 |
0.1669 | 0.2141 | 0.6816 | 0.9014 | 0.9107 | nan | 0.9209 | 0.9305 | 0.8525 | 0.8350 | 0.9681 | 0.0 | 0.8216 | 0.8964 | 0.7116 | 0.7166 | 0.9432 | 45 |
0.1622 | 0.2484 | 0.6736 | 0.8954 | 0.9093 | nan | 0.9411 | 0.9309 | 0.8257 | 0.8152 | 0.9639 | 0.0 | 0.8113 | 0.8988 | 0.7015 | 0.6999 | 0.9302 | 46 |
0.1859 | 0.2145 | 0.6757 | 0.8942 | 0.9134 | nan | 0.9466 | 0.9325 | 0.8332 | 0.7897 | 0.9689 | 0.0 | 0.8232 | 0.9005 | 0.7140 | 0.6672 | 0.9491 | 47 |
0.1693 | 0.2243 | 0.6817 | 0.9000 | 0.9120 | nan | 0.9198 | 0.9342 | 0.8038 | 0.8742 | 0.9682 | 0.0 | 0.8232 | 0.8962 | 0.6932 | 0.7314 | 0.9461 | 48 |
0.1650 | 0.2153 | 0.6876 | 0.9048 | 0.9199 | nan | 0.9185 | 0.9449 | 0.8327 | 0.8659 | 0.9617 | 0.0 | 0.8219 | 0.9131 | 0.7215 | 0.7242 | 0.9446 | 49 |
### Framework versions
- Transformers 4.37.2
- TensorFlow 2.11.0
- Datasets 3.3.1
- Tokenizers 0.15.2
## Model tree

Base model: nvidia/mit-b0