segformer-finetuned-blueberries-3classes_final_w_aug

This model is a fine-tuned version of nvidia/mit-b0 on the sankarip/blue_berry_data_3classes_final_w_aug dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1885
  • Mean Iou: 0.3832
  • Mean Accuracy: 0.5196
  • Overall Accuracy: 0.8996
  • Accuracy Background: nan
  • Accuracy Berry: 0.9462
  • Accuracy Debris: 0.2946
  • Accuracy Greenberry: 0.3179
  • Iou Background: 0.0
  • Iou Berry: 0.9393
  • Iou Debris: 0.2765
  • Iou Greenberry: 0.3171
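As a sanity check, the aggregate numbers above follow directly from the per-class values: Mean IoU averages all four class IoUs (including the background class, which scores 0.0 here), while Mean Accuracy averages only the classes with a defined accuracy, skipping the nan background entry. A minimal sketch in plain Python:

```python
import math

# Per-class results from the evaluation set above.
ious = {"background": 0.0, "berry": 0.9393, "debris": 0.2765, "greenberry": 0.3171}
accs = {"background": math.nan, "berry": 0.9462, "debris": 0.2946, "greenberry": 0.3179}

def nanmean(values):
    """Average the defined entries, ignoring nan (the convention behind the nan background accuracy)."""
    vals = [v for v in values if not math.isnan(v)]
    return sum(vals) / len(vals)

mean_iou = nanmean(ious.values())  # background counts: its IoU is 0.0, not nan
mean_acc = nanmean(accs.values())  # background skipped: its accuracy is nan

print(round(mean_iou, 4))  # 0.3832
print(round(mean_acc, 4))  # 0.5196
```

The background IoU of 0.0 (and nan accuracy) suggests the background class receives no predicted pixels, which is what drags Mean IoU well below the 0.90 Overall Accuracy.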

Model description

This model is SegFormer with the nvidia/mit-b0 encoder, fine-tuned for semantic segmentation of blueberry images. It labels each pixel as background, berry, debris, or greenberry.

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 1337
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: polynomial
  • training_steps: 4000
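With the polynomial scheduler, the learning rate decays from 6e-05 toward zero over the 4000 training steps. A small sketch of the decay curve, assuming the scheduler's default power of 1.0 (which makes the decay linear), an end learning rate of 0, and no warmup; the actual run may have used different scheduler defaults:

```python
BASE_LR = 6e-05
TRAINING_STEPS = 4000

def polynomial_lr(step, base_lr=BASE_LR, total=TRAINING_STEPS, power=1.0, end_lr=0.0):
    """Polynomial decay: interpolate from base_lr down to end_lr over `total` steps."""
    frac = min(step, total) / total
    return end_lr + (base_lr - end_lr) * (1 - frac) ** power

print(polynomial_lr(0))     # 6e-05 at the start
print(polynomial_lr(2000))  # 3e-05 halfway (power=1.0 makes this linear)
print(polynomial_lr(4000))  # 0.0 at the end
```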

Training results

Training Loss Epoch Step Validation Loss Mean Iou Mean Accuracy Overall Accuracy Accuracy Background Accuracy Berry Accuracy Debris Accuracy Greenberry Iou Background Iou Berry Iou Debris Iou Greenberry
No log 1.0 30 0.8151 0.3153 0.4411 0.8407 nan 0.8817 0.0032 0.4383 0.0 0.8616 0.0032 0.3967
No log 2.0 60 0.4668 0.3668 0.5138 0.8848 nan 0.9188 0.0077 0.6149 0.0 0.9107 0.0077 0.5490
No log 3.0 90 0.3764 0.3651 0.5148 0.8676 nan 0.8988 0.0050 0.6408 0.0 0.8934 0.0050 0.5621
0.6234 4.0 120 0.3360 0.3528 0.4941 0.8517 nan 0.8844 0.0039 0.5941 0.0 0.8796 0.0039 0.5276
0.6234 5.0 150 0.3083 0.3813 0.5264 0.9045 nan 0.9402 0.0368 0.6021 0.0 0.9306 0.0366 0.5581
0.6234 6.0 180 0.2831 0.3611 0.4920 0.8379 nan 0.8713 0.0627 0.5420 0.0 0.8675 0.0627 0.5143
0.3285 7.0 210 0.2698 0.3364 0.4530 0.8340 nan 0.8745 0.0727 0.4117 0.0 0.8712 0.0726 0.4017
0.3285 8.0 240 0.2419 0.3606 0.4869 0.8626 nan 0.9022 0.1045 0.4540 0.0 0.8966 0.1032 0.4427
0.3285 9.0 270 0.2431 0.3583 0.4833 0.8338 nan 0.8701 0.1090 0.4709 0.0 0.8672 0.1078 0.4580
0.2612 10.0 300 0.2555 0.3600 0.4849 0.8635 nan 0.9049 0.1337 0.4161 0.0 0.8995 0.1318 0.4088
0.2612 11.0 330 0.2347 0.3834 0.5180 0.8544 nan 0.8888 0.1458 0.5194 0.0 0.8855 0.1425 0.5057
0.2612 12.0 360 0.2266 0.3675 0.4948 0.8608 nan 0.9006 0.1514 0.4325 0.0 0.8963 0.1478 0.4258
0.2612 13.0 390 0.2207 0.3706 0.4988 0.8754 nan 0.9168 0.1581 0.4215 0.0 0.9121 0.1535 0.4169
0.2294 14.0 420 0.2194 0.3873 0.5234 0.8573 nan 0.8923 0.1762 0.5016 0.0 0.8890 0.1689 0.4913
0.2294 15.0 450 0.2048 0.3960 0.5349 0.8778 nan 0.9137 0.1781 0.5128 0.0 0.9091 0.1719 0.5031
0.2294 16.0 480 0.2131 0.3771 0.5087 0.8739 nan 0.9145 0.1878 0.4238 0.0 0.9102 0.1788 0.4194
0.21 17.0 510 0.2055 0.4095 0.5540 0.9062 nan 0.9432 0.1909 0.5279 0.0 0.9377 0.1833 0.5169
0.21 18.0 540 0.2089 0.3820 0.5154 0.8596 nan 0.8975 0.2025 0.4462 0.0 0.8950 0.1921 0.4410
0.21 19.0 570 0.2051 0.3809 0.5126 0.8855 nan 0.9268 0.1798 0.4312 0.0 0.9214 0.1754 0.4269
0.1978 20.0 600 0.2054 0.3785 0.5095 0.8697 nan 0.9098 0.1930 0.4256 0.0 0.9066 0.1855 0.4217
0.1978 21.0 630 0.2101 0.3633 0.4876 0.8318 nan 0.8696 0.1746 0.4187 0.0 0.8673 0.1716 0.4142
0.1978 22.0 660 0.2048 0.3622 0.4865 0.8662 nan 0.9098 0.1873 0.3623 0.0 0.9061 0.1822 0.3606
0.1978 23.0 690 0.1930 0.3969 0.5352 0.8894 nan 0.9280 0.2049 0.4727 0.0 0.9237 0.1981 0.4657
0.1833 24.0 720 0.1921 0.4111 0.5571 0.8989 nan 0.9362 0.2409 0.4941 0.0 0.9310 0.2264 0.4872
0.1833 25.0 750 0.1964 0.3812 0.5129 0.8908 nan 0.9339 0.2079 0.3969 0.0 0.9287 0.2017 0.3945
0.1833 26.0 780 0.1931 0.3742 0.5018 0.8758 nan 0.9177 0.1820 0.4056 0.0 0.9139 0.1805 0.4024
0.1867 27.0 810 0.1958 0.3644 0.4890 0.8638 nan 0.9066 0.1890 0.3714 0.0 0.9033 0.1847 0.3695
0.1867 28.0 840 0.1892 0.3954 0.5318 0.8893 nan 0.9273 0.1743 0.4937 0.0 0.9229 0.1733 0.4855
0.1867 29.0 870 0.2003 0.3939 0.5308 0.9083 nan 0.9510 0.2187 0.4225 0.0 0.9441 0.2113 0.4200
0.1821 30.0 900 0.1854 0.3975 0.5336 0.8888 nan 0.9277 0.2062 0.4668 0.0 0.9239 0.2046 0.4612
0.1821 31.0 930 0.1889 0.3836 0.5154 0.8794 nan 0.9213 0.2309 0.3939 0.0 0.9181 0.2248 0.3914
0.1821 32.0 960 0.1874 0.4134 0.5588 0.8895 nan 0.9257 0.2542 0.4964 0.0 0.9210 0.2431 0.4897
0.1821 33.0 990 0.1993 0.3867 0.5208 0.8640 nan 0.9030 0.2403 0.4192 0.0 0.9003 0.2315 0.4150
0.1785 34.0 1020 0.1970 0.4020 0.5429 0.9142 nan 0.9564 0.2407 0.4316 0.0 0.9496 0.2306 0.4280
0.1785 35.0 1050 0.1925 0.3935 0.5290 0.9010 nan 0.9423 0.1994 0.4454 0.0 0.9365 0.1964 0.4411
0.1785 36.0 1080 0.1938 0.3867 0.5202 0.9006 nan 0.9449 0.2351 0.3805 0.0 0.9390 0.2293 0.3787
0.1693 37.0 1110 0.1852 0.4027 0.5436 0.8962 nan 0.9369 0.2717 0.4223 0.0 0.9321 0.2598 0.4191
0.1693 38.0 1140 0.1943 0.3970 0.5370 0.9019 nan 0.9441 0.2560 0.4110 0.0 0.9383 0.2424 0.4075
0.1693 39.0 1170 0.1874 0.4214 0.5702 0.9130 nan 0.9505 0.2554 0.5046 0.0 0.9439 0.2442 0.4975
0.1668 40.0 1200 0.1870 0.3924 0.5298 0.8884 nan 0.9302 0.2650 0.3941 0.0 0.9261 0.2515 0.3919
0.1668 41.0 1230 0.1874 0.3730 0.5034 0.8887 nan 0.9351 0.2528 0.3224 0.0 0.9300 0.2407 0.3214
0.1668 42.0 1260 0.1895 0.4208 0.5693 0.9170 nan 0.9561 0.2749 0.4770 0.0 0.9484 0.2620 0.4727
0.1668 43.0 1290 0.1905 0.3731 0.5025 0.8973 nan 0.9445 0.2386 0.3244 0.0 0.9383 0.2308 0.3233
0.1693 44.0 1320 0.2006 0.4134 0.5608 0.9205 nan 0.9617 0.2750 0.4457 0.0 0.9519 0.2592 0.4423
0.1693 45.0 1350 0.1893 0.3796 0.5106 0.9038 nan 0.9498 0.2227 0.3594 0.0 0.9428 0.2179 0.3576
0.1693 46.0 1380 0.1935 0.3804 0.5127 0.9079 nan 0.9539 0.2160 0.3680 0.0 0.9459 0.2094 0.3662
0.1659 47.0 1410 0.1888 0.3650 0.4910 0.8868 nan 0.9337 0.2146 0.3246 0.0 0.9288 0.2079 0.3233
0.1659 48.0 1440 0.1884 0.3773 0.5084 0.9027 nan 0.9495 0.2355 0.3401 0.0 0.9428 0.2280 0.3384
0.1659 49.0 1470 0.1913 0.3944 0.5331 0.9063 nan 0.9507 0.2757 0.3730 0.0 0.9439 0.2623 0.3712
0.1638 50.0 1500 0.1820 0.4074 0.5510 0.8936 nan 0.9333 0.2896 0.4301 0.0 0.9283 0.2743 0.4270
0.1638 51.0 1530 0.1954 0.3931 0.5320 0.9166 nan 0.9611 0.2373 0.3977 0.0 0.9514 0.2253 0.3956
0.1638 52.0 1560 0.1939 0.4023 0.5504 0.9102 nan 0.9546 0.3453 0.3512 0.0 0.9472 0.3122 0.3498
0.1638 53.0 1590 0.1844 0.4046 0.5514 0.9008 nan 0.9430 0.3280 0.3832 0.0 0.9379 0.2989 0.3817
0.1614 54.0 1620 0.1852 0.3934 0.5316 0.8952 nan 0.9383 0.2780 0.3786 0.0 0.9326 0.2645 0.3767
0.1614 55.0 1650 0.1813 0.4034 0.5444 0.8977 nan 0.9391 0.2872 0.4069 0.0 0.9342 0.2750 0.4044
0.1614 56.0 1680 0.1927 0.3839 0.5199 0.9035 nan 0.9499 0.2752 0.3345 0.0 0.9424 0.2598 0.3335
0.1595 57.0 1710 0.1929 0.3878 0.5234 0.9130 nan 0.9584 0.2312 0.3806 0.0 0.9489 0.2237 0.3786
0.1595 58.0 1740 0.1840 0.3987 0.5417 0.8968 nan 0.9394 0.3068 0.3789 0.0 0.9341 0.2838 0.3769
0.1595 59.0 1770 0.1812 0.4011 0.5440 0.8935 nan 0.9355 0.3139 0.3828 0.0 0.9307 0.2928 0.3807
0.1583 60.0 1800 0.1852 0.3848 0.5214 0.8898 nan 0.9350 0.3050 0.3240 0.0 0.9300 0.2864 0.3230
0.1583 61.0 1830 0.1871 0.4036 0.5478 0.8934 nan 0.9350 0.3236 0.3849 0.0 0.9297 0.3016 0.3829
0.1583 62.0 1860 0.1884 0.3968 0.5377 0.9036 nan 0.9480 0.3098 0.3554 0.0 0.9415 0.2915 0.3542
0.1583 63.0 1890 0.1826 0.3930 0.5318 0.8935 nan 0.9372 0.2990 0.3594 0.0 0.9320 0.2819 0.3580
0.1579 64.0 1920 0.1809 0.4386 0.5969 0.9103 nan 0.9462 0.3475 0.4970 0.0 0.9402 0.3216 0.4924
0.1579 65.0 1950 0.1861 0.4011 0.5459 0.9027 nan 0.9462 0.3275 0.3641 0.0 0.9403 0.3013 0.3628
0.1579 66.0 1980 0.2017 0.3952 0.5359 0.9201 nan 0.9660 0.2787 0.3628 0.0 0.9550 0.2640 0.3616
0.159 67.0 2010 0.1943 0.4073 0.5499 0.9142 nan 0.9570 0.2859 0.4067 0.0 0.9489 0.2752 0.4051
0.159 68.0 2040 0.1900 0.3703 0.4997 0.8970 nan 0.9451 0.2492 0.3049 0.0 0.9386 0.2384 0.3042
0.159 69.0 2070 0.1925 0.3961 0.5384 0.9103 nan 0.9548 0.2907 0.3696 0.0 0.9453 0.2706 0.3684
0.1592 70.0 2100 0.1822 0.4166 0.5648 0.9049 nan 0.9447 0.3153 0.4344 0.0 0.9387 0.2956 0.4320
0.1592 71.0 2130 0.1881 0.3927 0.5316 0.9038 nan 0.9485 0.2881 0.3582 0.0 0.9413 0.2728 0.3569
0.1592 72.0 2160 0.1855 0.3996 0.5398 0.9063 nan 0.9490 0.2670 0.4035 0.0 0.9423 0.2554 0.4007
0.1592 73.0 2190 0.1894 0.3988 0.5415 0.9107 nan 0.9546 0.2857 0.3841 0.0 0.9468 0.2665 0.3819
0.1576 74.0 2220 0.1885 0.4126 0.5585 0.9184 nan 0.9603 0.2908 0.4244 0.0 0.9525 0.2758 0.4221
0.1576 75.0 2250 0.1896 0.3922 0.5304 0.9074 nan 0.9520 0.2641 0.3753 0.0 0.9444 0.2511 0.3734
0.1576 76.0 2280 0.1906 0.4095 0.5567 0.9167 nan 0.9584 0.2846 0.4272 0.0 0.9502 0.2631 0.4246
0.1524 77.0 2310 0.1940 0.3895 0.5293 0.9078 nan 0.9535 0.2861 0.3483 0.0 0.9450 0.2662 0.3469
0.1524 78.0 2340 0.1923 0.3836 0.5209 0.9016 nan 0.9483 0.2942 0.3201 0.0 0.9412 0.2742 0.3192
0.1524 79.0 2370 0.1990 0.3812 0.5188 0.9007 nan 0.9479 0.3027 0.3058 0.0 0.9407 0.2790 0.3050
0.1495 80.0 2400 0.1871 0.3912 0.5302 0.9051 nan 0.9500 0.2813 0.3594 0.0 0.9431 0.2642 0.3576
0.1495 81.0 2430 0.1917 0.3673 0.4980 0.8866 nan 0.9348 0.2825 0.2767 0.0 0.9290 0.2640 0.2761
0.1495 82.0 2460 0.1853 0.4002 0.5448 0.8950 nan 0.9376 0.3282 0.3685 0.0 0.9327 0.3014 0.3665
0.1495 83.0 2490 0.1906 0.3866 0.5242 0.8916 nan 0.9362 0.2950 0.3413 0.0 0.9304 0.2756 0.3403
0.1546 84.0 2520 0.1865 0.3935 0.5336 0.8963 nan 0.9404 0.3079 0.3525 0.0 0.9344 0.2886 0.3511
0.1546 85.0 2550 0.1876 0.4078 0.5561 0.9047 nan 0.9473 0.3455 0.3754 0.0 0.9413 0.3161 0.3740
0.1546 86.0 2580 0.1862 0.4023 0.5470 0.9000 nan 0.9428 0.3258 0.3724 0.0 0.9373 0.3007 0.3712
0.1491 87.0 2610 0.1851 0.3995 0.5442 0.8899 nan 0.9321 0.3348 0.3656 0.0 0.9268 0.3070 0.3642
0.1491 88.0 2640 0.1858 0.3991 0.5408 0.8989 nan 0.9420 0.3071 0.3732 0.0 0.9360 0.2886 0.3716
0.1491 89.0 2670 0.1905 0.3802 0.5140 0.9009 nan 0.9476 0.2657 0.3287 0.0 0.9400 0.2530 0.3278
0.1497 90.0 2700 0.1848 0.4055 0.5504 0.9061 nan 0.9487 0.3135 0.3891 0.0 0.9420 0.2928 0.3872
0.1497 91.0 2730 0.1848 0.3820 0.5162 0.8942 nan 0.9400 0.2801 0.3285 0.0 0.9344 0.2660 0.3275
0.1497 92.0 2760 0.1848 0.3930 0.5329 0.8932 nan 0.9371 0.3115 0.3502 0.0 0.9317 0.2912 0.3491
0.1497 93.0 2790 0.1870 0.3861 0.5208 0.9005 nan 0.9453 0.2520 0.3652 0.0 0.9383 0.2424 0.3637
0.1514 94.0 2820 0.1859 0.3767 0.5093 0.8899 nan 0.9369 0.2909 0.3001 0.0 0.9314 0.2760 0.2993
0.1514 95.0 2850 0.1890 0.3694 0.4989 0.8941 nan 0.9425 0.2629 0.2913 0.0 0.9364 0.2505 0.2906
0.1514 96.0 2880 0.1896 0.3852 0.5223 0.9015 nan 0.9475 0.2847 0.3349 0.0 0.9398 0.2672 0.3339
0.1479 97.0 2910 0.1882 0.3834 0.5203 0.9015 nan 0.9483 0.2948 0.3179 0.0 0.9419 0.2748 0.3170
0.1479 98.0 2940 0.1907 0.3706 0.5011 0.9003 nan 0.9490 0.2579 0.2965 0.0 0.9412 0.2453 0.2957
0.1479 99.0 2970 0.1915 0.3766 0.5098 0.9011 nan 0.9485 0.2628 0.3180 0.0 0.9413 0.2481 0.3172
0.1498 100.0 3000 0.1854 0.3950 0.5350 0.9018 nan 0.9457 0.2901 0.3690 0.0 0.9394 0.2732 0.3676
0.1498 101.0 3030 0.1885 0.3856 0.5230 0.8986 nan 0.9444 0.2926 0.3321 0.0 0.9377 0.2735 0.3311
0.1498 102.0 3060 0.1863 0.3970 0.5375 0.9032 nan 0.9466 0.2829 0.3830 0.0 0.9395 0.2668 0.3815
0.1498 103.0 3090 0.1933 0.3840 0.5218 0.9060 nan 0.9530 0.2910 0.3215 0.0 0.9448 0.2708 0.3206
0.1497 104.0 3120 0.1885 0.3784 0.5113 0.8983 nan 0.9451 0.2657 0.3231 0.0 0.9382 0.2528 0.3224
0.1497 105.0 3150 0.1882 0.3827 0.5196 0.8948 nan 0.9407 0.2936 0.3245 0.0 0.9346 0.2726 0.3239
0.1497 106.0 3180 0.1865 0.3922 0.5317 0.9056 nan 0.9510 0.2974 0.3466 0.0 0.9434 0.2797 0.3456
0.1511 107.0 3210 0.1890 0.3874 0.5245 0.9022 nan 0.9478 0.2814 0.3443 0.0 0.9406 0.2659 0.3432
0.1511 108.0 3240 0.1856 0.3831 0.5188 0.8972 nan 0.9435 0.2921 0.3209 0.0 0.9375 0.2747 0.3201
0.1511 109.0 3270 0.1878 0.3912 0.5309 0.9023 nan 0.9474 0.3005 0.3447 0.0 0.9409 0.2804 0.3436
0.1476 110.0 3300 0.1921 0.3856 0.5217 0.9081 nan 0.9547 0.2736 0.3367 0.0 0.9464 0.2603 0.3357
0.1476 111.0 3330 0.1888 0.3838 0.5204 0.9019 nan 0.9486 0.2902 0.3224 0.0 0.9412 0.2723 0.3216
0.1476 112.0 3360 0.1873 0.3848 0.5215 0.8992 nan 0.9454 0.2946 0.3244 0.0 0.9391 0.2765 0.3235
0.1476 113.0 3390 0.1902 0.3814 0.5168 0.9033 nan 0.9505 0.2829 0.3170 0.0 0.9429 0.2667 0.3162
0.1489 114.0 3420 0.1876 0.3858 0.5224 0.8996 nan 0.9453 0.2859 0.3361 0.0 0.9386 0.2697 0.3351
0.1489 115.0 3450 0.1885 0.3864 0.5242 0.8990 nan 0.9448 0.2973 0.3305 0.0 0.9385 0.2775 0.3295
0.1489 116.0 3480 0.1855 0.3909 0.5298 0.8980 nan 0.9426 0.2992 0.3477 0.0 0.9366 0.2806 0.3465
0.1433 117.0 3510 0.1869 0.3916 0.5314 0.8980 nan 0.9423 0.2974 0.3547 0.0 0.9360 0.2773 0.3531
0.1433 118.0 3540 0.1897 0.3818 0.5174 0.8978 nan 0.9441 0.2827 0.3254 0.0 0.9373 0.2655 0.3245
0.1433 119.0 3570 0.1879 0.3930 0.5341 0.9040 nan 0.9490 0.3061 0.3474 0.0 0.9415 0.2844 0.3462
0.1457 120.0 3600 0.1897 0.3840 0.5219 0.8976 nan 0.9440 0.3084 0.3133 0.0 0.9372 0.2863 0.3124
0.1457 121.0 3630 0.1887 0.3880 0.5262 0.9007 nan 0.9464 0.2964 0.3359 0.0 0.9392 0.2779 0.3349
0.1457 122.0 3660 0.1856 0.3953 0.5360 0.9030 nan 0.9473 0.3000 0.3607 0.0 0.9405 0.2816 0.3592
0.1457 123.0 3690 0.1909 0.3888 0.5270 0.9054 nan 0.9515 0.2943 0.3353 0.0 0.9439 0.2770 0.3343
0.1454 124.0 3720 0.1866 0.3905 0.5285 0.9024 nan 0.9473 0.2822 0.3559 0.0 0.9404 0.2672 0.3545
0.1454 125.0 3750 0.1886 0.3853 0.5224 0.9018 nan 0.9482 0.2927 0.3263 0.0 0.9409 0.2751 0.3254
0.1454 126.0 3780 0.1876 0.3872 0.5254 0.8998 nan 0.9456 0.3014 0.3291 0.0 0.9387 0.2817 0.3282
0.1459 127.0 3810 0.1885 0.3910 0.5302 0.9050 nan 0.9506 0.2988 0.3413 0.0 0.9435 0.2804 0.3402
0.1459 128.0 3840 0.1879 0.3865 0.5244 0.9010 nan 0.9473 0.3040 0.3218 0.0 0.9406 0.2844 0.3209
0.1459 129.0 3870 0.1894 0.3790 0.5135 0.8984 nan 0.9456 0.2849 0.3099 0.0 0.9384 0.2685 0.3092
0.1476 130.0 3900 0.1880 0.3867 0.5244 0.8996 nan 0.9456 0.2996 0.3280 0.0 0.9387 0.2812 0.3271
0.1476 131.0 3930 0.1890 0.3859 0.5231 0.9011 nan 0.9473 0.2948 0.3271 0.0 0.9402 0.2773 0.3262
0.1476 132.0 3960 0.1903 0.3867 0.5245 0.9047 nan 0.9513 0.2980 0.3244 0.0 0.9438 0.2794 0.3235
0.1476 133.0 3990 0.1888 0.3863 0.5237 0.9027 nan 0.9491 0.2982 0.3239 0.0 0.9421 0.2799 0.3230
0.1454 133.3333 4000 0.1885 0.3832 0.5196 0.8996 nan 0.9462 0.2946 0.3179 0.0 0.9393 0.2765 0.3171
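For reference, the per-class columns in this table come from the standard confusion-matrix definitions: IoU is TP / (TP + FP + FN), and per-class accuracy (recall) is TP / (TP + FN). A minimal illustration on a toy confusion matrix (rows = ground truth, columns = prediction; the pixel counts here are made up and only reuse this card's class names):

```python
# Toy confusion matrix over three classes (rows: true class, cols: predicted class).
classes = ["berry", "debris", "greenberry"]
cm = [
    [90, 5, 5],    # true berry pixels
    [10, 20, 10],  # true debris pixels
    [5, 5, 50],    # true greenberry pixels
]

for i, name in enumerate(classes):
    tp = cm[i][i]
    fn = sum(cm[i]) - tp                 # class-i pixels predicted as something else
    fp = sum(row[i] for row in cm) - tp  # other pixels predicted as class i
    iou = tp / (tp + fp + fn)
    acc = tp / (tp + fn)
    print(f"{name}: IoU={iou:.4f}, accuracy={acc:.4f}")
# berry: IoU=0.7826, accuracy=0.9000
# debris: IoU=0.4000, accuracy=0.5000
# greenberry: IoU=0.6667, accuracy=0.8333
```

As always, IoU is at most the per-class accuracy, since false positives only appear in the IoU denominator; that is why the Debris and Greenberry IoU columns track just below their accuracy columns throughout training.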

Framework versions

  • Transformers 4.48.0
  • Pytorch 2.4.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0
Model size

3.72M parameters (Safetensors, F32 tensors)
