# yolo_finetuned_fruits
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unnamed fruit detection dataset covering three classes (banana, orange, apple). It achieves the following results on the evaluation set:
- Loss: 0.7836
- mAP: 0.5785
- mAP@50: 0.8356
- mAP@75: 0.6723
- mAP (small): -1.0 (no small objects in the evaluation set)
- mAP (medium): 0.5125
- mAP (large): 0.605
- mAR@1: 0.4248
- mAR@10: 0.7284
- mAR@100: 0.7686
- mAR (small): -1.0
- mAR (medium): 0.6125
- mAR (large): 0.7829
- mAP (banana): 0.448
- mAR@100 (banana): 0.72
- mAP (orange): 0.6045
- mAR@100 (orange): 0.7857
- mAP (apple): 0.6831
- mAR@100 (apple): 0.8
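For quick testing, here is a minimal inference sketch (not part of the original card). It assumes the checkpoint is published as `joheras/yolo_finetuned_fruits` and uses the standard `transformers` object-detection API; the image path is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "joheras/yolo_finetuned_fruits"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("fruits.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Rescale predictions to the original image size and keep confident boxes.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```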
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation, `adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
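As an illustration only, these settings map onto `transformers.TrainingArguments` roughly as follows (`output_dir` is a placeholder, and any argument not listed above is left at its default):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="yolo_finetuned_fruits",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                 # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```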
### Training results

"No log" in the first column means the running training loss had not yet been recorded at that evaluation step (the loss appears to be logged every 500 steps).
Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP (small) | mAP (medium) | mAP (large) | mAR@1 | mAR@10 | mAR@100 | mAR (small) | mAR (medium) | mAR (large) | mAP (banana) | mAR@100 (banana) | mAP (orange) | mAR@100 (orange) | mAP (apple) | mAR@100 (apple) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 60 | 2.2392 | 0.0133 | 0.0374 | 0.0065 | -1.0 | 0.0006 | 0.0174 | 0.0367 | 0.1159 | 0.2228 | -1.0 | 0.075 | 0.2375 | 0.0033 | 0.295 | 0.0055 | 0.019 | 0.031 | 0.3543 |
No log | 2.0 | 120 | 1.8045 | 0.0433 | 0.094 | 0.035 | -1.0 | 0.0841 | 0.0463 | 0.1148 | 0.2667 | 0.4661 | -1.0 | 0.3708 | 0.4806 | 0.0131 | 0.425 | 0.0335 | 0.419 | 0.0834 | 0.5543 |
No log | 3.0 | 180 | 1.7343 | 0.0758 | 0.1809 | 0.0542 | -1.0 | 0.0666 | 0.0765 | 0.1559 | 0.3357 | 0.473 | -1.0 | 0.3708 | 0.4901 | 0.0802 | 0.39 | 0.0401 | 0.4548 | 0.107 | 0.5743 |
No log | 4.0 | 240 | 1.5930 | 0.0667 | 0.1545 | 0.0477 | -1.0 | 0.0345 | 0.0729 | 0.1339 | 0.3051 | 0.4819 | -1.0 | 0.2167 | 0.5061 | 0.0823 | 0.4875 | 0.0565 | 0.3524 | 0.0614 | 0.6057 |
No log | 5.0 | 300 | 1.4399 | 0.08 | 0.1519 | 0.0659 | -1.0 | 0.0812 | 0.0899 | 0.1599 | 0.327 | 0.5297 | -1.0 | 0.35 | 0.5466 | 0.0811 | 0.4925 | 0.0724 | 0.4595 | 0.0867 | 0.6371 |
No log | 6.0 | 360 | 1.2057 | 0.1493 | 0.2472 | 0.1804 | -1.0 | 0.1378 | 0.1618 | 0.2595 | 0.4663 | 0.6235 | -1.0 | 0.3542 | 0.6502 | 0.0964 | 0.5825 | 0.1548 | 0.6167 | 0.1967 | 0.6714 |
No log | 7.0 | 420 | 1.1930 | 0.2454 | 0.4068 | 0.2628 | -1.0 | 0.1931 | 0.2652 | 0.2975 | 0.4886 | 0.6008 | -1.0 | 0.3625 | 0.6243 | 0.1301 | 0.53 | 0.2107 | 0.5952 | 0.3953 | 0.6771 |
No log | 8.0 | 480 | 1.1520 | 0.3021 | 0.5017 | 0.3603 | -1.0 | 0.2696 | 0.3272 | 0.3091 | 0.5556 | 0.6268 | -1.0 | 0.4083 | 0.6477 | 0.136 | 0.57 | 0.2458 | 0.5905 | 0.5244 | 0.72 |
1.4531 | 9.0 | 540 | 1.0371 | 0.3781 | 0.5892 | 0.4062 | -1.0 | 0.3088 | 0.3964 | 0.3496 | 0.6028 | 0.6662 | -1.0 | 0.3958 | 0.6901 | 0.2285 | 0.63 | 0.3607 | 0.6429 | 0.5451 | 0.7257 |
1.4531 | 10.0 | 600 | 1.0391 | 0.3811 | 0.6249 | 0.4312 | -1.0 | 0.2525 | 0.4061 | 0.3532 | 0.6144 | 0.6606 | -1.0 | 0.4167 | 0.6837 | 0.2649 | 0.625 | 0.2871 | 0.631 | 0.5912 | 0.7257 |
1.4531 | 11.0 | 660 | 0.9947 | 0.4314 | 0.6884 | 0.4616 | -1.0 | 0.2102 | 0.4734 | 0.3681 | 0.6204 | 0.678 | -1.0 | 0.4 | 0.7046 | 0.2683 | 0.6025 | 0.449 | 0.7 | 0.5768 | 0.7314 |
1.4531 | 12.0 | 720 | 1.0551 | 0.4382 | 0.7558 | 0.4724 | -1.0 | 0.2711 | 0.4696 | 0.339 | 0.6118 | 0.6658 | -1.0 | 0.475 | 0.6833 | 0.2939 | 0.6325 | 0.4729 | 0.6762 | 0.5477 | 0.6886 |
1.4531 | 13.0 | 780 | 0.9251 | 0.4752 | 0.7361 | 0.5321 | -1.0 | 0.3079 | 0.5056 | 0.3823 | 0.6394 | 0.7055 | -1.0 | 0.4667 | 0.7265 | 0.333 | 0.6375 | 0.4894 | 0.6905 | 0.6033 | 0.7886 |
1.4531 | 14.0 | 840 | 0.8957 | 0.4906 | 0.7363 | 0.5688 | -1.0 | 0.34 | 0.5195 | 0.3813 | 0.6715 | 0.7187 | -1.0 | 0.5208 | 0.7345 | 0.3125 | 0.66 | 0.52 | 0.7333 | 0.6394 | 0.7629 |
1.4531 | 15.0 | 900 | 0.9153 | 0.4978 | 0.7646 | 0.5708 | -1.0 | 0.41 | 0.5297 | 0.401 | 0.6679 | 0.7131 | -1.0 | 0.5708 | 0.7275 | 0.3437 | 0.6275 | 0.5364 | 0.7548 | 0.6133 | 0.7571 |
1.4531 | 16.0 | 960 | 0.8663 | 0.5276 | 0.7993 | 0.576 | -1.0 | 0.3697 | 0.5634 | 0.4088 | 0.6738 | 0.7315 | -1.0 | 0.525 | 0.7493 | 0.3965 | 0.675 | 0.5225 | 0.731 | 0.6638 | 0.7886 |
0.7981 | 17.0 | 1020 | 0.8745 | 0.5359 | 0.8136 | 0.5912 | -1.0 | 0.3684 | 0.5684 | 0.4217 | 0.6903 | 0.7463 | -1.0 | 0.5458 | 0.765 | 0.3881 | 0.68 | 0.5621 | 0.7762 | 0.6575 | 0.7829 |
0.7981 | 18.0 | 1080 | 0.8692 | 0.5375 | 0.814 | 0.6356 | -1.0 | 0.4627 | 0.5653 | 0.4139 | 0.6979 | 0.7461 | -1.0 | 0.6083 | 0.76 | 0.3799 | 0.6825 | 0.5793 | 0.7786 | 0.6532 | 0.7771 |
0.7981 | 19.0 | 1140 | 0.8285 | 0.5488 | 0.8236 | 0.6288 | -1.0 | 0.4448 | 0.5802 | 0.4215 | 0.7103 | 0.7608 | -1.0 | 0.6542 | 0.7699 | 0.4209 | 0.7175 | 0.574 | 0.7762 | 0.6513 | 0.7886 |
0.7981 | 20.0 | 1200 | 0.8036 | 0.5544 | 0.8123 | 0.6339 | -1.0 | 0.4699 | 0.5869 | 0.4227 | 0.7209 | 0.7735 | -1.0 | 0.625 | 0.7859 | 0.4012 | 0.7175 | 0.5806 | 0.8 | 0.6815 | 0.8029 |
0.7981 | 21.0 | 1260 | 0.8163 | 0.5546 | 0.8194 | 0.6187 | -1.0 | 0.4976 | 0.5843 | 0.426 | 0.7134 | 0.7648 | -1.0 | 0.6083 | 0.781 | 0.3824 | 0.6925 | 0.6011 | 0.8048 | 0.6803 | 0.7971 |
0.7981 | 22.0 | 1320 | 0.8323 | 0.5608 | 0.8266 | 0.6316 | -1.0 | 0.5279 | 0.5848 | 0.4161 | 0.711 | 0.7573 | -1.0 | 0.6083 | 0.7706 | 0.4091 | 0.6975 | 0.5902 | 0.7857 | 0.6831 | 0.7886 |
0.7981 | 23.0 | 1380 | 0.8178 | 0.5621 | 0.83 | 0.6621 | -1.0 | 0.4861 | 0.5881 | 0.4194 | 0.7124 | 0.7578 | -1.0 | 0.6125 | 0.7707 | 0.4356 | 0.71 | 0.5775 | 0.7833 | 0.6733 | 0.78 |
0.7981 | 24.0 | 1440 | 0.8000 | 0.5615 | 0.8331 | 0.66 | -1.0 | 0.5107 | 0.5872 | 0.4135 | 0.7153 | 0.7615 | -1.0 | 0.5917 | 0.7765 | 0.4259 | 0.725 | 0.5974 | 0.7738 | 0.6611 | 0.7857 |
0.5872 | 25.0 | 1500 | 0.7918 | 0.5691 | 0.8323 | 0.6611 | -1.0 | 0.5043 | 0.5945 | 0.4271 | 0.7258 | 0.7671 | -1.0 | 0.6 | 0.7824 | 0.4274 | 0.7175 | 0.5935 | 0.781 | 0.6863 | 0.8029 |
0.5872 | 26.0 | 1560 | 0.7879 | 0.5846 | 0.839 | 0.674 | -1.0 | 0.4845 | 0.611 | 0.4234 | 0.7313 | 0.7656 | -1.0 | 0.6208 | 0.7789 | 0.457 | 0.7125 | 0.6081 | 0.7786 | 0.6888 | 0.8057 |
0.5872 | 27.0 | 1620 | 0.7810 | 0.5793 | 0.8423 | 0.664 | -1.0 | 0.485 | 0.6038 | 0.4285 | 0.7251 | 0.7736 | -1.0 | 0.6167 | 0.7865 | 0.4498 | 0.735 | 0.6025 | 0.7857 | 0.6857 | 0.8 |
0.5872 | 28.0 | 1680 | 0.7838 | 0.5779 | 0.8359 | 0.6719 | -1.0 | 0.5125 | 0.6044 | 0.424 | 0.7256 | 0.7666 | -1.0 | 0.6125 | 0.7803 | 0.4494 | 0.725 | 0.6017 | 0.7833 | 0.6827 | 0.7914 |
0.5872 | 29.0 | 1740 | 0.7841 | 0.5776 | 0.8363 | 0.6718 | -1.0 | 0.5125 | 0.604 | 0.4248 | 0.7276 | 0.7678 | -1.0 | 0.6125 | 0.782 | 0.4479 | 0.72 | 0.6019 | 0.7833 | 0.6829 | 0.8 |
0.5872 | 30.0 | 1800 | 0.7836 | 0.5785 | 0.8356 | 0.6723 | -1.0 | 0.5125 | 0.605 | 0.4248 | 0.7284 | 0.7686 | -1.0 | 0.6125 | 0.7829 | 0.448 | 0.72 | 0.6045 | 0.7857 | 0.6831 | 0.8 |
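Metrics with this shape (mAP, mAP@50, mAR@100, per-class values, and -1.0 for empty size buckets) are what `torchmetrics`' `MeanAveragePrecision` reports; the card does not say which evaluation code was used, so the following is a minimal sketch with dummy boxes, not the actual evaluation pipeline.

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# class_metrics=True adds per-class mAP/mAR, matching the per-fruit columns above.
metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 50.0]]),  # dummy prediction
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),  # hypothetical label id, e.g. 0 = banana
}]
targets = [{
    "boxes": torch.tensor([[12.0, 11.0, 48.0, 52.0]]),  # dummy ground truth
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
# Returns map, map_50, map_75, map_small/medium/large, mar_1/10/100, etc.;
# size buckets with no ground truths come back as -1.0.
print(metric.compute())
```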
### Framework versions

- Transformers 4.49.0
- PyTorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1