# Amoros_Beaugosse_test-large-2025_05_21_36883-bs16_freeze

This model is a fine-tuned version of facebook/dinov2-large (the fine-tuning dataset is not documented in this card). It achieves the following results on the evaluation set (an illustrative metric-computation sketch follows the list):
- Loss: 0.0853
- F1 Micro: 0.6248
- F1 Macro: 0.4593
- Accuracy: 0.5213
- Learning Rate: 0.0000
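
The combination of micro/macro F1 alongside a lower accuracy figure is consistent with a multi-label evaluation, though the card does not say so explicitly. The sketch below shows one plausible way these numbers could be computed; the sigmoid activation, the 0.5 threshold, and the use of scikit-learn are assumptions, not taken from the original training code.

```python
# A minimal sketch of how the reported metrics could be computed.
# Assumptions (not from the original card): multi-label logits thresholded
# at 0.5 after a sigmoid, and "Accuracy" meaning exact-match (subset) accuracy.
import numpy as np
from sklearn.metrics import f1_score, accuracy_score

def compute_metrics(logits: np.ndarray, labels: np.ndarray) -> dict:
    probs = 1 / (1 + np.exp(-logits))      # sigmoid over raw logits
    preds = (probs >= 0.5).astype(int)     # hypothetical 0.5 decision threshold
    return {
        "f1_micro": f1_score(labels, preds, average="micro", zero_division=0),
        "f1_macro": f1_score(labels, preds, average="macro", zero_division=0),
        "accuracy": accuracy_score(labels, preds),  # subset accuracy for multi-label
    }
```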
## Model description
More information needed
## Intended uses & limitations
More information needed
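
Although the intended use is undocumented, the checkpoint can be loaded like any Transformers image-classification model. The sketch below is illustrative only: the repository id matches this model's name on the Hub, the input image is hypothetical, and the sigmoid post-processing assumes multi-label training (suggested by the F1 metrics above but not confirmed by the card).

```python
# Illustrative loading sketch; usage details beyond this are undocumented.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "Amoros/Amoros_Beaugosse_test-large-2025_05_21_36883-bs16_freeze"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
# Sigmoid assumes a multi-label head; use softmax/argmax instead if the
# model was in fact trained single-label.
probs = torch.sigmoid(logits)
```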
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows this list):
- learning_rate: 0.001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
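
As a rough illustration, these settings map onto a Transformers `TrainingArguments` configuration along the lines of the sketch below. The `output_dir` is hypothetical, and mapping "Native AMP" to `fp16=True` is an assumption; note also that the staged learning-rate drops visible in the results table suggest the effective schedule was not purely linear in practice.

```python
# A sketch of the reported hyperparameters as Transformers TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dinov2-large-finetune",  # hypothetical path
    learning_rate=1e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",                 # AdamW; betas=(0.9, 0.999), eps=1e-8 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                           # "Native AMP" mixed-precision training
)
```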
### Training results
Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | Accuracy | Learning Rate |
---|---|---|---|---|---|---|---|
0.1213 | 1.0 | 1953 | 0.1046 | 0.4612 | 0.2564 | 0.3234 | 0.001 |
0.1152 | 2.0 | 3906 | 0.1038 | 0.4901 | 0.2770 | 0.3573 | 0.001 |
0.1147 | 3.0 | 5859 | 0.1009 | 0.4919 | 0.2707 | 0.3565 | 0.001 |
0.1141 | 4.0 | 7812 | 0.1008 | 0.4977 | 0.2798 | 0.3596 | 0.001 |
0.1127 | 5.0 | 9765 | 0.1019 | 0.5050 | 0.2555 | 0.3705 | 0.001 |
0.1131 | 6.0 | 11718 | 0.1001 | 0.5039 | 0.2749 | 0.3700 | 0.001 |
0.1133 | 7.0 | 13671 | 0.1004 | 0.5146 | 0.3024 | 0.3790 | 0.001 |
0.1128 | 8.0 | 15624 | 0.1004 | 0.5112 | 0.2921 | 0.3798 | 0.001 |
0.1119 | 9.0 | 17577 | 0.0984 | 0.5345 | 0.2850 | 0.4081 | 0.001 |
0.1145 | 10.0 | 19530 | 0.1012 | 0.5173 | 0.3044 | 0.3820 | 0.001 |
0.1128 | 11.0 | 21483 | 0.0998 | 0.5087 | 0.2539 | 0.3724 | 0.001 |
0.1131 | 12.0 | 23436 | 0.0987 | 0.5197 | 0.2886 | 0.3841 | 0.001 |
0.1109 | 13.0 | 25389 | 0.0980 | 0.5060 | 0.2898 | 0.3651 | 0.001 |
0.1122 | 14.0 | 27342 | 0.0994 | 0.5217 | 0.2700 | 0.3878 | 0.001 |
0.113 | 15.0 | 29295 | 0.1008 | 0.5238 | 0.2805 | 0.3908 | 0.001 |
0.1135 | 16.0 | 31248 | 0.0981 | 0.5208 | 0.2764 | 0.3859 | 0.001 |
0.1137 | 17.0 | 33201 | 0.0978 | 0.5259 | 0.2954 | 0.3905 | 0.001 |
0.114 | 18.0 | 35154 | 0.0994 | 0.5162 | 0.2774 | 0.3836 | 0.001 |
0.1143 | 19.0 | 37107 | 0.0976 | 0.5134 | 0.2685 | 0.3742 | 0.001 |
0.1144 | 20.0 | 39060 | 0.0979 | 0.5194 | 0.2862 | 0.3801 | 0.001 |
0.1154 | 21.0 | 41013 | 0.1040 | 0.5174 | 0.2748 | 0.3874 | 0.001 |
0.1133 | 22.0 | 42966 | 0.0971 | 0.5323 | 0.3003 | 0.4011 | 0.001 |
0.114 | 23.0 | 44919 | 0.0982 | 0.5241 | 0.2923 | 0.3912 | 0.001 |
0.1113 | 24.0 | 46872 | 0.0985 | 0.5161 | 0.2783 | 0.3793 | 0.001 |
0.1132 | 25.0 | 48825 | 0.0996 | 0.5160 | 0.2651 | 0.3792 | 0.001 |
0.1127 | 26.0 | 50778 | 0.0987 | 0.5367 | 0.2911 | 0.4103 | 0.001 |
0.1135 | 27.0 | 52731 | 0.0987 | 0.5277 | 0.2785 | 0.3952 | 0.001 |
0.1133 | 28.0 | 54684 | 0.1018 | 0.5285 | 0.2633 | 0.4042 | 0.001 |
0.1064 | 29.0 | 56637 | 0.0944 | 0.5590 | 0.3221 | 0.4321 | 0.0001 |
0.1063 | 30.0 | 58590 | 0.0929 | 0.5714 | 0.3412 | 0.4493 | 0.0001 |
0.1042 | 31.0 | 60543 | 0.0918 | 0.5832 | 0.3729 | 0.4619 | 0.0001 |
0.1038 | 32.0 | 62496 | 0.0912 | 0.5768 | 0.3565 | 0.4528 | 0.0001 |
0.103 | 33.0 | 64449 | 0.0904 | 0.5873 | 0.3712 | 0.4670 | 0.0001 |
0.1036 | 34.0 | 66402 | 0.0903 | 0.5843 | 0.3731 | 0.4594 | 0.0001 |
0.1047 | 35.0 | 68355 | 0.0903 | 0.5869 | 0.3827 | 0.4621 | 0.0001 |
0.1026 | 36.0 | 70308 | 0.0897 | 0.5943 | 0.3954 | 0.4729 | 0.0001 |
0.1028 | 37.0 | 72261 | 0.0893 | 0.5983 | 0.3983 | 0.4794 | 0.0001 |
0.1022 | 38.0 | 74214 | 0.0892 | 0.5997 | 0.4080 | 0.4820 | 0.0001 |
0.1012 | 39.0 | 76167 | 0.0889 | 0.6008 | 0.3995 | 0.4822 | 0.0001 |
0.1022 | 40.0 | 78120 | 0.0884 | 0.6030 | 0.4085 | 0.4843 | 0.0001 |
0.1002 | 41.0 | 80073 | 0.0888 | 0.5940 | 0.3896 | 0.4700 | 0.0001 |
0.1008 | 42.0 | 82026 | 0.0880 | 0.6024 | 0.4174 | 0.4807 | 0.0001 |
0.0991 | 43.0 | 83979 | 0.0882 | 0.6065 | 0.4184 | 0.4895 | 0.0001 |
0.1005 | 44.0 | 85932 | 0.0881 | 0.6047 | 0.4126 | 0.4860 | 0.0001 |
0.1001 | 45.0 | 87885 | 0.0878 | 0.6090 | 0.4236 | 0.4940 | 0.0001 |
0.0989 | 46.0 | 89838 | 0.0875 | 0.6053 | 0.4205 | 0.4862 | 0.0001 |
0.0996 | 47.0 | 91791 | 0.0873 | 0.6069 | 0.4215 | 0.4880 | 0.0001 |
0.0986 | 48.0 | 93744 | 0.0869 | 0.6110 | 0.4260 | 0.4938 | 0.0001 |
0.0988 | 49.0 | 95697 | 0.0872 | 0.6114 | 0.4155 | 0.4966 | 0.0001 |
0.0994 | 50.0 | 97650 | 0.0871 | 0.6136 | 0.4336 | 0.4974 | 0.0001 |
0.0998 | 51.0 | 99603 | 0.0872 | 0.6104 | 0.4271 | 0.4947 | 0.0001 |
0.0992 | 52.0 | 101556 | 0.0871 | 0.6109 | 0.4289 | 0.4951 | 0.0001 |
0.0988 | 53.0 | 103509 | 0.0871 | 0.6137 | 0.4268 | 0.4990 | 0.0001 |
0.0977 | 54.0 | 105462 | 0.0867 | 0.6147 | 0.4405 | 0.5006 | 0.0001 |
0.0987 | 55.0 | 107415 | 0.0872 | 0.6131 | 0.4344 | 0.4986 | 0.0001 |
0.0985 | 56.0 | 109368 | 0.0865 | 0.6122 | 0.4268 | 0.4959 | 0.0001 |
0.0982 | 57.0 | 111321 | 0.0864 | 0.6177 | 0.4405 | 0.5041 | 0.0001 |
0.0997 | 58.0 | 113274 | 0.0865 | 0.6104 | 0.4243 | 0.4962 | 0.0001 |
0.0982 | 59.0 | 115227 | 0.0865 | 0.6165 | 0.4318 | 0.5044 | 0.0001 |
0.0973 | 60.0 | 117180 | 0.0868 | 0.6140 | 0.4458 | 0.4993 | 0.0001 |
0.0974 | 61.0 | 119133 | 0.0872 | 0.6115 | 0.4204 | 0.4940 | 0.0001 |
0.0976 | 62.0 | 121086 | 0.0862 | 0.6121 | 0.4359 | 0.4944 | 0.0001 |
0.0977 | 63.0 | 123039 | 0.0865 | 0.6087 | 0.4275 | 0.4900 | 0.0001 |
0.0968 | 64.0 | 124992 | 0.0862 | 0.6156 | 0.4383 | 0.5010 | 0.0001 |
0.0971 | 65.0 | 126945 | 0.0864 | 0.6151 | 0.4484 | 0.5009 | 0.0001 |
0.0968 | 66.0 | 128898 | 0.0857 | 0.6216 | 0.4437 | 0.5099 | 0.0001 |
0.0962 | 67.0 | 130851 | 0.0859 | 0.6186 | 0.4409 | 0.5051 | 0.0001 |
0.0977 | 68.0 | 132804 | 0.0859 | 0.6186 | 0.4427 | 0.5049 | 0.0001 |
0.0969 | 69.0 | 134757 | 0.0863 | 0.6197 | 0.4344 | 0.5109 | 0.0001 |
0.0972 | 70.0 | 136710 | 0.0858 | 0.6242 | 0.4471 | 0.5158 | 0.0001 |
0.0961 | 71.0 | 138663 | 0.0858 | 0.6219 | 0.4529 | 0.5104 | 0.0001 |
0.0972 | 72.0 | 140616 | 0.0858 | 0.6215 | 0.4392 | 0.5131 | 0.0001 |
0.0948 | 73.0 | 142569 | 0.0852 | 0.6232 | 0.4489 | 0.5112 | 1e-05 |
0.0956 | 74.0 | 144522 | 0.0850 | 0.6263 | 0.4569 | 0.5188 | 1e-05 |
0.0928 | 75.0 | 146475 | 0.0848 | 0.6273 | 0.4543 | 0.5189 | 1e-05 |
0.0963 | 76.0 | 148428 | 0.0847 | 0.6299 | 0.4604 | 0.5244 | 1e-05 |
0.0946 | 77.0 | 150381 | 0.0846 | 0.6272 | 0.4473 | 0.5187 | 1e-05 |
0.0947 | 78.0 | 152334 | 0.0845 | 0.6325 | 0.4540 | 0.5271 | 1e-05 |
0.0939 | 79.0 | 154287 | 0.0846 | 0.6245 | 0.4471 | 0.5143 | 1e-05 |
0.0938 | 80.0 | 156240 | 0.0846 | 0.6264 | 0.4489 | 0.5169 | 1e-05 |
0.0946 | 81.0 | 158193 | 0.0844 | 0.6299 | 0.4637 | 0.5227 | 1e-05 |
0.0937 | 82.0 | 160146 | 0.0844 | 0.6307 | 0.4596 | 0.5253 | 1e-05 |
0.094 | 83.0 | 162099 | 0.0844 | 0.6289 | 0.4588 | 0.5218 | 1e-05 |
0.0957 | 84.0 | 164052 | 0.0841 | 0.6286 | 0.4608 | 0.5221 | 1e-05 |
0.0934 | 85.0 | 166005 | 0.0844 | 0.6270 | 0.4572 | 0.5195 | 1e-05 |
0.0932 | 86.0 | 167958 | 0.0841 | 0.6310 | 0.4637 | 0.5262 | 1e-05 |
0.0938 | 87.0 | 169911 | 0.0844 | 0.6282 | 0.4543 | 0.5218 | 1e-05 |
0.0926 | 88.0 | 171864 | 0.0844 | 0.6318 | 0.4613 | 0.5267 | 1e-05 |
0.0937 | 89.0 | 173817 | 0.0843 | 0.6260 | 0.4494 | 0.5152 | 1e-05 |
0.0925 | 90.0 | 175770 | 0.0844 | 0.6243 | 0.4515 | 0.5140 | 1e-05 |
0.0937 | 91.0 | 177723 | 0.0841 | 0.6327 | 0.4606 | 0.5276 | 1e-05 |
0.0924 | 92.0 | 179676 | 0.0841 | 0.6301 | 0.4568 | 0.5226 | 1e-05 |
0.0923 | 93.0 | 181629 | 0.0839 | 0.6317 | 0.4593 | 0.5272 | 0.0000 |
0.0929 | 94.0 | 183582 | 0.0841 | 0.6301 | 0.4572 | 0.5234 | 0.0000 |
0.0924 | 95.0 | 185535 | 0.0844 | 0.6308 | 0.4614 | 0.5248 | 0.0000 |
0.0937 | 96.0 | 187488 | 0.0839 | 0.6314 | 0.4611 | 0.5251 | 0.0000 |
0.0926 | 97.0 | 189441 | 0.0843 | 0.6296 | 0.4546 | 0.5227 | 0.0000 |
0.0951 | 98.0 | 191394 | 0.0842 | 0.6294 | 0.4556 | 0.5217 | 0.0000 |
0.0928 | 99.0 | 193347 | 0.0840 | 0.6301 | 0.4615 | 0.5236 | 0.0000 |
0.093 | 100.0 | 195300 | 0.0842 | 0.6319 | 0.4612 | 0.5263 | 0.0000 |
## Framework versions
- Transformers 4.48.0
- PyTorch 2.6.0+cu118
- Datasets 3.0.2
- Tokenizers 0.21.1