vit-estadosfenologicos

This model is a fine-tuned version of google/vit-base-patch32-384 for image classification on an unspecified dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the list):

  • Loss: 0.1298
  • Accuracy: 0.9581
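
Loading the checkpoint for inference should follow the standard Transformers image-classification pattern. The sketch below is illustrative rather than taken from the card: the repo id matches the hub page for this model, but the input file name is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Repo id as shown on the hub page; adjust if the checkpoint lives elsewhere.
model_id = "Ignaciobfp/vit-estadosfenologicos"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example.jpg")  # placeholder path to an input image

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_class])
```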

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after the list):

  • learning_rate: 0.0005
  • train_batch_size: 512
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 50
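
As a rough guide, the hyperparameters above map onto a TrainingArguments configuration as sketched below. This is a reconstruction, not the original training script: output_dir is a placeholder, the evaluation cadence is inferred from the per-epoch results table, and per_device_train_batch_size=512 assumes the reported batch size was reached on a single device rather than through multiple GPUs or gradient accumulation.

```python
from transformers import TrainingArguments

# Reconstructed configuration; values mirror the list above, but
# output_dir and the single-device batch-size assumption are guesses.
training_args = TrainingArguments(
    output_dir="vit-estadosfenologicos",
    learning_rate=5e-4,
    per_device_train_batch_size=512,  # reported train_batch_size
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    eval_strategy="epoch",  # evaluation cadence is an assumption
)
```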

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 12   | 0.7243          | 0.8397   |
| No log        | 2.0   | 24   | 0.3981          | 0.8720   |
| No log        | 3.0   | 36   | 0.2737          | 0.9135   |
| No log        | 4.0   | 48   | 0.2173          | 0.9250   |
| No log        | 5.0   | 60   | 0.1910          | 0.9319   |
| No log        | 6.0   | 72   | 0.1757          | 0.9354   |
| No log        | 7.0   | 84   | 0.1638          | 0.9412   |
| No log        | 8.0   | 96   | 0.1572          | 0.9377   |
| 0.4042        | 9.0   | 108  | 0.1504          | 0.9458   |
| 0.4042        | 10.0  | 120  | 0.1456          | 0.9412   |
| 0.4042        | 11.0  | 132  | 0.1415          | 0.9458   |
| 0.4042        | 12.0  | 144  | 0.1375          | 0.9458   |
| 0.4042        | 13.0  | 156  | 0.1351          | 0.9458   |
| 0.4042        | 14.0  | 168  | 0.1327          | 0.9469   |
| 0.4042        | 15.0  | 180  | 0.1294          | 0.9504   |
| 0.4042        | 16.0  | 192  | 0.1270          | 0.9469   |
| 0.1655        | 17.0  | 204  | 0.1258          | 0.9504   |
| 0.1655        | 18.0  | 216  | 0.1247          | 0.9516   |
| 0.1655        | 19.0  | 228  | 0.1221          | 0.9539   |
| 0.1655        | 20.0  | 240  | 0.1210          | 0.9527   |
| 0.1655        | 21.0  | 252  | 0.1208          | 0.9504   |
| 0.1655        | 22.0  | 264  | 0.1185          | 0.9550   |
| 0.1655        | 23.0  | 276  | 0.1176          | 0.9539   |
| 0.1655        | 24.0  | 288  | 0.1158          | 0.9539   |
| 0.1371        | 25.0  | 300  | 0.1162          | 0.9539   |
| 0.1371        | 26.0  | 312  | 0.1142          | 0.9550   |
| 0.1371        | 27.0  | 324  | 0.1148          | 0.9550   |
| 0.1371        | 28.0  | 336  | 0.1131          | 0.9550   |
| 0.1371        | 29.0  | 348  | 0.1122          | 0.9550   |
| 0.1371        | 30.0  | 360  | 0.1118          | 0.9550   |
| 0.1371        | 31.0  | 372  | 0.1116          | 0.9539   |
| 0.1371        | 32.0  | 384  | 0.1103          | 0.9550   |
| 0.1371        | 33.0  | 396  | 0.1102          | 0.9550   |
| 0.124         | 34.0  | 408  | 0.1103          | 0.9550   |
| 0.124         | 35.0  | 420  | 0.1089          | 0.9573   |
| 0.124         | 36.0  | 432  | 0.1088          | 0.9562   |
| 0.124         | 37.0  | 444  | 0.1092          | 0.9550   |
| 0.124         | 38.0  | 456  | 0.1079          | 0.9562   |
| 0.124         | 39.0  | 468  | 0.1082          | 0.9562   |
| 0.124         | 40.0  | 480  | 0.1077          | 0.9562   |
| 0.124         | 41.0  | 492  | 0.1071          | 0.9573   |
| 0.1168        | 42.0  | 504  | 0.1068          | 0.9573   |
| 0.1168        | 43.0  | 516  | 0.1073          | 0.9562   |
| 0.1168        | 44.0  | 528  | 0.1067          | 0.9562   |
| 0.1168        | 45.0  | 540  | 0.1066          | 0.9562   |
| 0.1168        | 46.0  | 552  | 0.1062          | 0.9573   |
| 0.1168        | 47.0  | 564  | 0.1063          | 0.9573   |
| 0.1168        | 48.0  | 576  | 0.1063          | 0.9562   |
| 0.1168        | 49.0  | 588  | 0.1063          | 0.9562   |
| 0.1129        | 50.0  | 600  | 0.1063          | 0.9562   |
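
The card does not say how the Accuracy column was computed. A typical setup passes a compute_metrics hook like the hypothetical sketch below to the Trainer, using the evaluate library's accuracy metric.

```python
import numpy as np
import evaluate

# Hypothetical metric hook: argmax over logits, then plain accuracy.
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```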

Framework versions

  • Transformers 4.49.0
  • Pytorch 2.6.0+cu126
  • Datasets 3.3.2
  • Tokenizers 0.21.0