---
library_name: transformers
license: apache-2.0
base_model: timm/levit_128.fb_dist_in1k
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
  - precision
  - recall
model-index:
  - name: levit_128.fb_dist_in1k-finetuned-stroke-binary
    results: []
---

levit_128.fb_dist_in1k-finetuned-stroke-binary

This model is a fine-tuned version of timm/levit_128.fb_dist_in1k on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: nan
  • Accuracy: 0.6974
  • F1: 0.7014
  • Precision: 0.7298
  • Recall: 0.6974
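
For quick use, here is a minimal inference sketch. It assumes the checkpoint is published as BTX24/levit_128.fb_dist_in1k-finetuned-stroke-binary (repo id inferred from this card, not confirmed) and that it loads through the standard transformers image-classification Auto classes; scan.png is a placeholder input.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Assumed repo id, inferred from the card; adjust if the checkpoint lives elsewhere.
repo_id = "BTX24/levit_128.fb_dist_in1k-finetuned-stroke-binary"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("scan.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])  # maps the predicted class index to its label
```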

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 32
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 36
  • mixed_precision_training: Native AMP
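
These settings map onto transformers TrainingArguments roughly as follows. This is a hedged sketch of the configuration only, not the author's training script; the dataset, model, and Trainer wiring are omitted, and output_dir is an assumption.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="levit_128.fb_dist_in1k-finetuned-stroke-binary",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,   # effective train batch size: 8 * 4 = 32
    optim="adamw_torch",             # AdamW; betas=(0.9, 0.999), eps=1e-08 are the defaults
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=36,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```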

Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 0.695         | 0.6202 | 100  | nan             | 0.5364   | 0.5421 | 0.5541    | 0.5364 |
| 0.6804        | 1.2357 | 200  | nan             | 0.5798   | 0.5833 | 0.6278    | 0.5798 |
| 0.6821        | 1.8558 | 300  | nan             | 0.6232   | 0.6280 | 0.6413    | 0.6232 |
| 0.6726        | 2.4713 | 400  | nan             | 0.6671   | 0.6711 | 0.6829    | 0.6671 |
| 0.6546        | 3.0868 | 500  | nan             | 0.7024   | 0.7021 | 0.7018    | 0.7024 |
| 0.647         | 3.7070 | 600  | nan             | 0.7065   | 0.7093 | 0.7159    | 0.7065 |
| 0.6263        | 4.3225 | 700  | nan             | 0.6956   | 0.6991 | 0.7096    | 0.6956 |
| 0.6112        | 4.9426 | 800  | nan             | 0.6766   | 0.6807 | 0.7123    | 0.6766 |
| 0.5704        | 5.5581 | 900  | nan             | 0.6974   | 0.7014 | 0.7298    | 0.6974 |
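
Accuracy and Recall are identical in every row, which is consistent with weighted-average metrics over the two classes. A plausible compute_metrics for these columns (an assumption, not the author's confirmed code) would look like:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair that Trainer passes in.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```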

Framework versions

  • Transformers 4.48.3
  • Pytorch 2.6.0+cu124
  • Datasets 3.4.0
  • Tokenizers 0.21.0