Version_weird_ASAP_FineTuningBERT_AugV12_k4_task1_organization_k4_k4_fold3
This model is a fine-tuned version of bert-base-uncased on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.2919
- Qwk (quadratic weighted kappa): 0.5117
- Mse (mean squared error): 1.2915
- Rmse (root mean squared error): 1.1365
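
These metrics can be reproduced from raw predictions with scikit-learn. The sketch below uses placeholder scores, and the rounding step before Qwk is an assumption, since the card does not state how continuous outputs are discretized:

```python
# Minimal sketch of the reported metrics; the example scores are
# placeholders and rounding for Qwk is an assumption of this sketch.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 4, 3])          # gold scores (placeholder data)
y_pred = np.array([2.4, 2.9, 3.6, 3.1])  # model outputs (placeholder data)

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
# Quadratic weighted kappa compares discrete ratings, so round first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```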
Model description
More information needed
Intended uses & limitations
More information needed
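
No intended uses are documented. As a starting point, the checkpoint can be loaded with the standard transformers API; the sketch below assumes a single-logit regression head, which is inferred from the Mse/Rmse metrics rather than stated by the author:

```python
# Hedged inference sketch: AutoModelForSequenceClassification with a
# regression head is an assumption, not confirmed by the model card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "genki10/Version_weird_ASAP_FineTuningBERT_AugV12_k4_task1_organization_k4_k4_fold3"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

inputs = tokenizer("An example essay to score.", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)  # raw predicted score; any rounding/clipping is up to the user
```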
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
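
The hyperparameters above map directly onto a transformers TrainingArguments configuration. A minimal sketch follows; `output_dir` is a placeholder, and the model, datasets, and metric function are omitted:

```python
# Sketch of a Trainer setup matching the listed hyperparameters only;
# the actual training script is not published with this card.
from transformers import Trainer, TrainingArguments

args = TrainingArguments(
    output_dir="out",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
# trainer = Trainer(model=model, args=args, train_dataset=..., eval_dataset=...)
```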
Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
No log | 1.0 | 2 | 11.1127 | 0.0096 | 11.1108 | 3.3333 |
No log | 2.0 | 4 | 9.7402 | 0.0 | 9.7383 | 3.1206 |
No log | 3.0 | 6 | 8.6794 | 0.0 | 8.6776 | 2.9458 |
No log | 4.0 | 8 | 7.6563 | 0.0 | 7.6546 | 2.7667 |
No log | 5.0 | 10 | 6.2389 | 0.0113 | 6.2375 | 2.4975 |
No log | 6.0 | 12 | 4.8836 | 0.0110 | 4.8824 | 2.2096 |
No log | 7.0 | 14 | 4.4741 | 0.0 | 4.4728 | 2.1149 |
No log | 8.0 | 16 | 2.7779 | 0.0067 | 2.7770 | 1.6664 |
No log | 9.0 | 18 | 2.5292 | -0.0208 | 2.5283 | 1.5901 |
No log | 10.0 | 20 | 2.2690 | 0.1246 | 2.2681 | 1.5060 |
No log | 11.0 | 22 | 2.2322 | 0.1028 | 2.2314 | 1.4938 |
No log | 12.0 | 24 | 1.4297 | 0.0102 | 1.4291 | 1.1955 |
No log | 13.0 | 26 | 1.0374 | 0.0202 | 1.0369 | 1.0183 |
No log | 14.0 | 28 | 0.9199 | 0.3044 | 0.9194 | 0.9589 |
No log | 15.0 | 30 | 0.8441 | 0.1125 | 0.8436 | 0.9185 |
No log | 16.0 | 32 | 0.8408 | 0.0553 | 0.8405 | 0.9168 |
No log | 17.0 | 34 | 0.8872 | 0.0468 | 0.8869 | 0.9417 |
No log | 18.0 | 36 | 1.1041 | 0.0454 | 1.1039 | 1.0507 |
No log | 19.0 | 38 | 1.2496 | 0.1184 | 1.2494 | 1.1178 |
No log | 20.0 | 40 | 1.3955 | 0.1701 | 1.3953 | 1.1812 |
No log | 21.0 | 42 | 1.4742 | 0.1279 | 1.4741 | 1.2141 |
No log | 22.0 | 44 | 1.8307 | 0.1136 | 1.8306 | 1.3530 |
No log | 23.0 | 46 | 1.8743 | 0.1540 | 1.8744 | 1.3691 |
No log | 24.0 | 48 | 1.3418 | 0.2499 | 1.3421 | 1.1585 |
No log | 25.0 | 50 | 1.4000 | 0.3122 | 1.4002 | 1.1833 |
No log | 26.0 | 52 | 0.7883 | 0.4514 | 0.7886 | 0.8880 |
No log | 27.0 | 54 | 0.8842 | 0.4562 | 0.8845 | 0.9405 |
No log | 28.0 | 56 | 0.8091 | 0.4782 | 0.8095 | 0.8997 |
No log | 29.0 | 58 | 0.8671 | 0.5193 | 0.8674 | 0.9314 |
No log | 30.0 | 60 | 1.1537 | 0.4649 | 1.1538 | 1.0741 |
No log | 31.0 | 62 | 0.8885 | 0.5334 | 0.8885 | 0.9426 |
No log | 32.0 | 64 | 0.6855 | 0.5952 | 0.6856 | 0.8280 |
No log | 33.0 | 66 | 0.8459 | 0.5718 | 0.8458 | 0.9197 |
No log | 34.0 | 68 | 1.3321 | 0.4741 | 1.3317 | 1.1540 |
No log | 35.0 | 70 | 1.0430 | 0.5254 | 1.0428 | 1.0212 |
No log | 36.0 | 72 | 0.8966 | 0.5762 | 0.8966 | 0.9469 |
No log | 37.0 | 74 | 0.9836 | 0.5639 | 0.9835 | 0.9917 |
No log | 38.0 | 76 | 1.2988 | 0.5030 | 1.2984 | 1.1395 |
No log | 39.0 | 78 | 1.3520 | 0.4863 | 1.3516 | 1.1626 |
No log | 40.0 | 80 | 0.9535 | 0.5621 | 0.9534 | 0.9764 |
No log | 41.0 | 82 | 0.9675 | 0.5621 | 0.9673 | 0.9835 |
No log | 42.0 | 84 | 1.1669 | 0.5255 | 1.1667 | 1.0801 |
No log | 43.0 | 86 | 1.0110 | 0.5385 | 1.0108 | 1.0054 |
No log | 44.0 | 88 | 1.1482 | 0.5155 | 1.1481 | 1.0715 |
No log | 45.0 | 90 | 1.3793 | 0.4934 | 1.3790 | 1.1743 |
No log | 46.0 | 92 | 1.3271 | 0.4976 | 1.3267 | 1.1518 |
No log | 47.0 | 94 | 1.0184 | 0.5481 | 1.0183 | 1.0091 |
No log | 48.0 | 96 | 0.9180 | 0.5581 | 0.9179 | 0.9581 |
No log | 49.0 | 98 | 1.0982 | 0.5360 | 1.0980 | 1.0478 |
No log | 50.0 | 100 | 1.1822 | 0.5233 | 1.1819 | 1.0871 |
No log | 51.0 | 102 | 1.1085 | 0.5364 | 1.1082 | 1.0527 |
No log | 52.0 | 104 | 1.2919 | 0.5117 | 1.2915 | 1.1365 |
Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0