---
library_name: transformers
license: apache-2.0
base_model: microsoft/swin-small-patch4-window7-224
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- f1
model-index:
- name: swin-small-patch4-window7-224
  results: []
---
# swin-small-patch4-window7-224
This model is a fine-tuned version of [microsoft/swin-small-patch4-window7-224](https://huggingface.co/microsoft/swin-small-patch4-window7-224) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1892
- Accuracy: 0.9778
- Precision: 0.9804
- Sensitivity: 1.0
- Specificity: 0.9733
- F1: 0.9783
- Auc: 0.9909
- Mcc: 0.9267
- J Stat: 0.9733
- Confusion Matrix: [[146, 4], [0, 30]]
## Model description
More information needed
## Intended uses & limitations
More information needed
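Pending a full write-up, the sketch below shows one way to run inference with this checkpoint through the standard `transformers` image-classification API. The repo id and image path are placeholders, not taken from this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical repo id; replace with wherever this checkpoint is hosted.
model_id = "your-username/swin-small-patch4-window7-224"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```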
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a sketch of how they map onto `TrainingArguments` follows the list):
- learning_rate: 8.068420130106194e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.06206076180207542
- num_epochs: 10
- label_smoothing_factor: 0.06417838785936565
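As referenced above, this is a minimal sketch of how the reported hyperparameters could be passed to the Hugging Face `Trainer`. The `output_dir` is an assumption, and anything not listed is left at its `transformers` default (which already gives AdamW betas=(0.9, 0.999) and epsilon=1e-08).

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-small-patch4-window7-224",  # assumed, not from the card
    learning_rate=8.068420130106194e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",  # AdamW with default betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="linear",
    warmup_ratio=0.06206076180207542,
    num_train_epochs=10,
    label_smoothing_factor=0.06417838785936565,
)
```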
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Sensitivity | Specificity | F1 | Auc | Mcc | J Stat | Confusion Matrix |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:-----------:|:-----------:|:------:|:------:|:------:|:------:|:------------------------:|
| No log | 1.0 | 47 | 0.2904 | 0.9228 | 0.9256 | 0.7333 | 0.9892 | 0.9191 | 0.9754 | 0.7944 | 0.7225 | [[1100, 12], [104, 286]] |
| 0.4216 | 2.0 | 94 | 0.2247 | 0.9634 | 0.9640 | 0.8769 | 0.9937 | 0.9627 | 0.9866 | 0.9038 | 0.8706 | [[1105, 7], [48, 342]] |
| 0.2859 | 3.0 | 141 | 0.2064 | 0.9800 | 0.9800 | 0.9410 | 0.9937 | 0.9799 | 0.9914 | 0.9477 | 0.9347 | [[1105, 7], [23, 367]] |
| 0.2231 | 4.0 | 188 | 0.1778 | 0.9847 | 0.9847 | 0.9513 | 0.9964 | 0.9846 | 0.9956 | 0.9600 | 0.9477 | [[1108, 4], [19, 371]] |
| 0.1986 | 5.0 | 235 | 0.1590 | 0.9933 | 0.9933 | 0.9872 | 0.9955 | 0.9933 | 0.9982 | 0.9827 | 0.9827 | [[1107, 5], [5, 385]] |
| 0.174 | 6.0 | 282 | 0.1484 | 0.9973 | 0.9973 | 0.9923 | 0.9991 | 0.9973 | 0.9995 | 0.9931 | 0.9914 | [[1111, 1], [3, 387]] |
| 0.1526 | 7.0 | 329 | 0.1469 | 0.9980 | 0.9980 | 0.9949 | 0.9991 | 0.9980 | 0.9999 | 0.9948 | 0.9940 | [[1111, 1], [2, 388]] |
| 0.1518 | 8.0 | 376 | 0.1461 | 0.9987 | 0.9987 | 0.9949 | 1.0 | 0.9987 | 0.9997 | 0.9965 | 0.9949 | [[1112, 0], [2, 388]] |
| 0.148 | 9.0 | 423 | 0.1459 | 0.9987 | 0.9987 | 0.9949 | 1.0 | 0.9987 | 1.0000 | 0.9965 | 0.9949 | [[1112, 0], [2, 388]] |
| 0.1496 | 10.0 | 470 | 0.1458 | 0.9987 | 0.9987 | 0.9949 | 1.0 | 0.9987 | 1.0000 | 0.9965 | 0.9949 | [[1112, 0], [2, 388]] |
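The derived metrics in this table follow directly from each confusion matrix (assuming rows are true classes, columns are predictions, and class 1 is the positive class). A minimal sketch using the evaluation-set matrix `[[146, 4], [0, 30]]` reported at the top of this card:

```python
import numpy as np

# Evaluation-set confusion matrix from above.
cm = np.array([[146, 4],
               [0, 30]])
tn, fp, fn, tp = cm.ravel()

sensitivity = tp / (tp + fn)            # recall of the positive class
specificity = tn / (tn + fp)
j_stat = sensitivity + specificity - 1  # Youden's J statistic
mcc = (tp * tn - fp * fn) / np.sqrt(
    float((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
)

print(f"Sensitivity: {sensitivity:.4f}")  # 1.0000
print(f"Specificity: {specificity:.4f}")  # 0.9733
print(f"J stat:      {j_stat:.4f}")       # 0.9733
print(f"MCC:         {mcc:.4f}")          # 0.9267
```

These values reproduce the Sensitivity, Specificity, J Stat, and Mcc figures reported for the evaluation set.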
### Framework versions
- Transformers 4.52.4
- Pytorch 2.6.0+cu124
- Datasets 4.0.0
- Tokenizers 0.21.2