# SST-2-FULL_FT-seed52
This model is a fine-tuned version of [FacebookAI/roberta-base](https://huggingface.co/FacebookAI/roberta-base) on the GLUE SST-2 (Stanford Sentiment Treebank) dataset, as indicated by the model name. It achieves the following results on the evaluation set (matching the checkpoint at step 7600, epoch 3.61, in the results table below):
- Loss: 0.2609
- Accuracy: 0.9461
## Model description

A roberta-base encoder with a binary sequence-classification head, fully fine-tuned (all parameters updated, per the "FULL_FT" tag in the model name) for sentiment classification.
## Intended uses & limitations

Intended for binary sentiment classification of short English sentences in the style of SST-2 (movie-review excerpts). Performance on other domains, longer documents, or other languages has not been evaluated. A usage sketch follows.
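A minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub under `ekiprop/SST-2-FULL_FT-seed52` (the repository named in the model tree below); the label names depend on the saved config:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; repo id taken from the model tree below.
classifier = pipeline("text-classification", model="ekiprop/SST-2-FULL_FT-seed52")

print(classifier("A deeply moving and beautifully acted film."))
# Output shape: [{'label': ..., 'score': ...}]; for SST-2 heads,
# index 1 conventionally corresponds to "positive".
```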
## Training and evaluation data

The model name and metrics point to the GLUE SST-2 task. The training log below shows roughly 2,105 steps per epoch at batch size 32 (about 67,360 examples), consistent with the SST-2 training split of 67,349 sentences; evaluation is presumably on the 872-sentence validation split. A loading sketch follows.
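A hedged sketch of loading the presumed training data with the `datasets` library; the `glue`/`sst2` configuration is an assumption based on the model name:

```python
from datasets import load_dataset

# Assumption: the training data is the GLUE SST-2 task implied by the model name.
dataset = load_dataset("glue", "sst2")

print(dataset["train"].num_rows)  # 67349 — consistent with ~2,105 steps/epoch at batch size 32
print(dataset["validation"][0])   # {'sentence': '...', 'label': 0 or 1, 'idx': ...}
```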
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training; a hedged `TrainingArguments` reconstruction follows the list:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (PyTorch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
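The list above maps onto Hugging Face `TrainingArguments` roughly as follows. This is a hypothetical reconstruction: `output_dir` and the 200-step evaluation cadence (inferred from the results table) are assumptions, not stated in the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="SST-2-FULL_FT-seed52",  # assumed; not stated in the card
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",                # AdamW, PyTorch implementation
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    eval_strategy="steps",              # the log below evaluates every 200 steps
    eval_steps=200,
    logging_steps=200,
)
```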
### Training results
| Training Loss | Epoch  | Step  | Validation Loss | Accuracy |
|:--------------|:-------|:------|:----------------|:---------|
| 0.3866        | 0.0950 | 200   | 0.2197          | 0.9140   |
| 0.2929        | 0.1900 | 400   | 0.1937          | 0.9289   |
| 0.2641        | 0.2850 | 600   | 0.1987          | 0.9243   |
| 0.2327        | 0.3800 | 800   | 0.2481          | 0.9209   |
| 0.2165        | 0.4751 | 1000  | 0.2646          | 0.9083   |
| 0.2168        | 0.5701 | 1200  | 0.2851          | 0.9094   |
| 0.2103        | 0.6651 | 1400  | 0.1918          | 0.9323   |
| 0.2088        | 0.7601 | 1600  | 0.2156          | 0.9289   |
| 0.2078        | 0.8551 | 1800  | 0.1856          | 0.9415   |
| 0.1839        | 0.9501 | 2000  | 0.2036          | 0.9312   |
| 0.1781        | 1.0451 | 2200  | 0.2492          | 0.9289   |
| 0.1423        | 1.1401 | 2400  | 0.2348          | 0.9335   |
| 0.1513        | 1.2352 | 2600  | 0.3199          | 0.9094   |
| 0.1426        | 1.3302 | 2800  | 0.3110          | 0.9163   |
| 0.141         | 1.4252 | 3000  | 0.2409          | 0.9346   |
| 0.1397        | 1.5202 | 3200  | 0.2275          | 0.9323   |
| 0.1515        | 1.6152 | 3400  | 0.1913          | 0.9346   |
| 0.1299        | 1.7102 | 3600  | 0.2218          | 0.9346   |
| 0.1369        | 1.8052 | 3800  | 0.1840          | 0.9438   |
| 0.1603        | 1.9002 | 4000  | 0.1929          | 0.9392   |
| 0.1414        | 1.9952 | 4200  | 0.2499          | 0.9300   |
| 0.1054        | 2.0903 | 4400  | 0.2797          | 0.9323   |
| 0.1093        | 2.1853 | 4600  | 0.2857          | 0.9323   |
| 0.1053        | 2.2803 | 4800  | 0.2706          | 0.9358   |
| 0.1098        | 2.3753 | 5000  | 0.1876          | 0.9392   |
| 0.1013        | 2.4703 | 5200  | 0.2995          | 0.9312   |
| 0.1044        | 2.5653 | 5400  | 0.2438          | 0.9358   |
| 0.1175        | 2.6603 | 5600  | 0.2601          | 0.9358   |
| 0.0996        | 2.7553 | 5800  | 0.2624          | 0.9323   |
| 0.1041        | 2.8504 | 6000  | 0.2613          | 0.9335   |
| 0.1           | 2.9454 | 6200  | 0.2270          | 0.9404   |
| 0.0967        | 3.0404 | 6400  | 0.2532          | 0.9415   |
| 0.0765        | 3.1354 | 6600  | 0.2480          | 0.9369   |
| 0.0786        | 3.2304 | 6800  | 0.2748          | 0.9392   |
| 0.0847        | 3.3254 | 7000  | 0.2463          | 0.9381   |
| 0.0771        | 3.4204 | 7200  | 0.2602          | 0.9404   |
| 0.0782        | 3.5154 | 7400  | 0.2752          | 0.9392   |
| 0.0784        | 3.6105 | 7600  | 0.2609          | 0.9461   |
| 0.0843        | 3.7055 | 7800  | 0.2155          | 0.9461   |
| 0.0798        | 3.8005 | 8000  | 0.2697          | 0.9404   |
| 0.069         | 3.8955 | 8200  | 0.2685          | 0.9392   |
| 0.0828        | 3.9905 | 8400  | 0.2371          | 0.9438   |
| 0.0523        | 4.0855 | 8600  | 0.3272          | 0.9323   |
| 0.0497        | 4.1805 | 8800  | 0.3200          | 0.9358   |
| 0.0559        | 4.2755 | 9000  | 0.3128          | 0.9392   |
| 0.0578        | 4.3705 | 9200  | 0.3078          | 0.9415   |
| 0.0606        | 4.4656 | 9400  | 0.2961          | 0.9369   |
| 0.0543        | 4.5606 | 9600  | 0.2839          | 0.9450   |
| 0.0635        | 4.6556 | 9800  | 0.2774          | 0.9392   |
| 0.06          | 4.7506 | 10000 | 0.2753          | 0.9404   |
| 0.0542        | 4.8456 | 10200 | 0.2896          | 0.9381   |
| 0.0538        | 4.9406 | 10400 | 0.2905          | 0.9392   |
### Framework versions
- Transformers 4.54.1
- PyTorch 2.5.1+cu121
- Datasets 4.0.0
- Tokenizers 0.21.4
## Model tree for ekiprop/SST-2-FULL_FT-seed52

Base model: [FacebookAI/roberta-base](https://huggingface.co/FacebookAI/roberta-base)