# relevance-analysis
This model is a fine-tuned version of [mdhugol/indonesia-bert-sentiment-classification](https://huggingface.co/mdhugol/indonesia-bert-sentiment-classification) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.5308
- Accuracy: 0.8230
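A minimal inference sketch, assuming the checkpoint is published on the Hub under a hypothetical id such as `your-username/relevance-analysis` and that, like the base model, it performs single-sentence text classification on Indonesian input:

```python
from transformers import pipeline

# Hypothetical Hub id; substitute the actual repository path of this checkpoint.
classifier = pipeline(
    "text-classification",
    model="your-username/relevance-analysis",
)

# Indonesian example input, since the base model targets Indonesian text.
print(classifier("Produk ini sangat membantu pekerjaan saya."))
# e.g. [{'label': 'LABEL_0', 'score': 0.97}]
```

The label names depend on how the fine-tuning configured `id2label`, which is not documented here.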
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):
- learning_rate: 1e-08
- train_batch_size: 8
- eval_batch_size: 8
- seed: 41
- optimizer: AdamW (torch implementation, `OptimizerNames.ADAMW_TORCH`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 200
- mixed_precision_training: Native AMP
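A minimal sketch of how these settings map onto `transformers.TrainingArguments` (dataset loading and `Trainer` wiring are omitted, since the training data is not documented; `output_dir` is a placeholder):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="relevance-analysis",   # placeholder
    learning_rate=1e-8,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=41,
    optim="adamw_torch",               # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=200,
    fp16=True,                         # native AMP mixed precision
)
```

Note that a learning rate of 1e-08 is several orders of magnitude below the usual 1e-05 to 5e-05 range for BERT fine-tuning, which is consistent with the very slow convergence visible in the results table below.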
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
3.4407 | 1.7986 | 500 | 3.4936 | 0.2 |
3.0684 | 3.5971 | 1000 | 3.1423 | 0.2162 |
2.7449 | 5.3957 | 1500 | 2.7929 | 0.2243 |
2.4066 | 7.1942 | 2000 | 2.4499 | 0.2459 |
2.0853 | 8.9928 | 2500 | 2.1284 | 0.2838 |
1.8021 | 10.7914 | 3000 | 1.8369 | 0.3514 |
1.546 | 12.5899 | 3500 | 1.5842 | 0.4176 |
1.3521 | 14.3885 | 4000 | 1.3852 | 0.4811 |
1.1889 | 16.1871 | 4500 | 1.2250 | 0.5392 |
1.0671 | 17.9856 | 5000 | 1.1067 | 0.5824 |
0.9838 | 19.7842 | 5500 | 1.0203 | 0.6365 |
0.8825 | 21.5827 | 6000 | 0.9541 | 0.6757 |
0.846 | 23.3813 | 6500 | 0.9053 | 0.7027 |
0.8263 | 25.1799 | 7000 | 0.8669 | 0.7284 |
0.7888 | 26.9784 | 7500 | 0.8360 | 0.75 |
0.7516 | 28.7770 | 8000 | 0.8106 | 0.7608 |
0.7417 | 30.5755 | 8500 | 0.7894 | 0.7703 |
0.7277 | 32.3741 | 9000 | 0.7702 | 0.7797 |
0.7217 | 34.1727 | 9500 | 0.7532 | 0.7878 |
0.694 | 35.9712 | 10000 | 0.7382 | 0.7932 |
0.673 | 37.7698 | 10500 | 0.7249 | 0.7973 |
0.6955 | 39.5683 | 11000 | 0.7124 | 0.7986 |
0.6544 | 41.3669 | 11500 | 0.7013 | 0.8027 |
0.6548 | 43.1655 | 12000 | 0.6906 | 0.8068 |
0.6355 | 44.9640 | 12500 | 0.6811 | 0.8108 |
0.6386 | 46.7626 | 13000 | 0.6720 | 0.8122 |
0.627 | 48.5612 | 13500 | 0.6637 | 0.8122 |
0.6199 | 50.3597 | 14000 | 0.6559 | 0.8122 |
0.6291 | 52.1583 | 14500 | 0.6487 | 0.8122 |
0.5938 | 53.9568 | 15000 | 0.6422 | 0.8122 |
0.603 | 55.7554 | 15500 | 0.6358 | 0.8135 |
0.594 | 57.5540 | 16000 | 0.6298 | 0.8162 |
0.5931 | 59.3525 | 16500 | 0.6243 | 0.8176 |
0.5865 | 61.1511 | 17000 | 0.6189 | 0.8176 |
0.5797 | 62.9496 | 17500 | 0.6143 | 0.8176 |
0.5764 | 64.7482 | 18000 | 0.6099 | 0.8176 |
0.5856 | 66.5468 | 18500 | 0.6054 | 0.8176 |
0.5513 | 68.3453 | 19000 | 0.6019 | 0.8176 |
0.5823 | 70.1439 | 19500 | 0.5978 | 0.8176 |
0.5588 | 71.9424 | 20000 | 0.5945 | 0.8176 |
0.5571 | 73.7410 | 20500 | 0.5913 | 0.8189 |
0.5722 | 75.5396 | 21000 | 0.5882 | 0.8189 |
0.5507 | 77.3381 | 21500 | 0.5853 | 0.8189 |
0.5524 | 79.1367 | 22000 | 0.5827 | 0.8189 |
0.5487 | 80.9353 | 22500 | 0.5800 | 0.8189 |
0.545 | 82.7338 | 23000 | 0.5776 | 0.8189 |
0.5465 | 84.5324 | 23500 | 0.5754 | 0.8189 |
0.5645 | 86.3309 | 24000 | 0.5730 | 0.8189 |
0.5195 | 88.1295 | 24500 | 0.5712 | 0.8189 |
0.5405 | 89.9281 | 25000 | 0.5692 | 0.8189 |
0.5331 | 91.7266 | 25500 | 0.5674 | 0.8189 |
0.5384 | 93.5252 | 26000 | 0.5657 | 0.8189 |
0.5407 | 95.3237 | 26500 | 0.5639 | 0.8189 |
0.5368 | 97.1223 | 27000 | 0.5623 | 0.8189 |
0.5254 | 98.9209 | 27500 | 0.5607 | 0.8189 |
0.5327 | 100.7194 | 28000 | 0.5592 | 0.8189 |
0.5309 | 102.5180 | 28500 | 0.5578 | 0.8203 |
0.5309 | 104.3165 | 29000 | 0.5564 | 0.8203 |
0.516 | 106.1151 | 29500 | 0.5550 | 0.8203 |
0.5325 | 107.9137 | 30000 | 0.5537 | 0.8203 |
0.5239 | 109.7122 | 30500 | 0.5525 | 0.8203 |
0.5122 | 111.5108 | 31000 | 0.5514 | 0.8203 |
0.5309 | 113.3094 | 31500 | 0.5502 | 0.8203 |
0.5185 | 115.1079 | 32000 | 0.5491 | 0.8203 |
0.5209 | 116.9065 | 32500 | 0.5480 | 0.8203 |
0.5101 | 118.7050 | 33000 | 0.5470 | 0.8203 |
0.5063 | 120.5036 | 33500 | 0.5461 | 0.8203 |
0.5241 | 122.3022 | 34000 | 0.5452 | 0.8203 |
0.5056 | 124.1007 | 34500 | 0.5444 | 0.8203 |
0.5122 | 125.8993 | 35000 | 0.5435 | 0.8203 |
0.5065 | 127.6978 | 35500 | 0.5428 | 0.8203 |
0.5142 | 129.4964 | 36000 | 0.5419 | 0.8189 |
0.5115 | 131.2950 | 36500 | 0.5412 | 0.8189 |
0.5158 | 133.0935 | 37000 | 0.5405 | 0.8189 |
0.5076 | 134.8921 | 37500 | 0.5398 | 0.8203 |
0.5079 | 136.6906 | 38000 | 0.5393 | 0.8203 |
0.5014 | 138.4892 | 38500 | 0.5386 | 0.8203 |
0.511 | 140.2878 | 39000 | 0.5380 | 0.8203 |
0.5035 | 142.0863 | 39500 | 0.5375 | 0.8203 |
0.5008 | 143.8849 | 40000 | 0.5370 | 0.8203 |
0.5052 | 145.6835 | 40500 | 0.5365 | 0.8216 |
0.508 | 147.4820 | 41000 | 0.5360 | 0.8216 |
0.4975 | 149.2806 | 41500 | 0.5356 | 0.8216 |
0.5102 | 151.0791 | 42000 | 0.5351 | 0.8216 |
0.5092 | 152.8777 | 42500 | 0.5346 | 0.8216 |
0.4985 | 154.6763 | 43000 | 0.5343 | 0.8216 |
0.4997 | 156.4748 | 43500 | 0.5339 | 0.8216 |
0.5058 | 158.2734 | 44000 | 0.5336 | 0.8216 |
0.4978 | 160.0719 | 44500 | 0.5333 | 0.8230 |
0.4983 | 161.8705 | 45000 | 0.5330 | 0.8230 |
0.5097 | 163.6691 | 45500 | 0.5327 | 0.8230 |
0.4966 | 165.4676 | 46000 | 0.5325 | 0.8230 |
0.5003 | 167.2662 | 46500 | 0.5322 | 0.8230 |
0.4967 | 169.0647 | 47000 | 0.5320 | 0.8230 |
0.497 | 170.8633 | 47500 | 0.5318 | 0.8230 |
0.5108 | 172.6619 | 48000 | 0.5316 | 0.8230 |
0.4891 | 174.4604 | 48500 | 0.5315 | 0.8230 |
0.4967 | 176.2590 | 49000 | 0.5314 | 0.8230 |
0.5029 | 178.0576 | 49500 | 0.5313 | 0.8230 |
0.5023 | 179.8561 | 50000 | 0.5312 | 0.8230 |
0.5007 | 181.6547 | 50500 | 0.5311 | 0.8230 |
0.4966 | 183.4532 | 51000 | 0.5310 | 0.8230 |
0.4921 | 185.2518 | 51500 | 0.5310 | 0.8230 |
0.4966 | 187.0504 | 52000 | 0.5309 | 0.8230 |
0.5056 | 188.8489 | 52500 | 0.5309 | 0.8230 |
0.4888 | 190.6475 | 53000 | 0.5309 | 0.8230 |
0.4995 | 192.4460 | 53500 | 0.5309 | 0.8230 |
0.4976 | 194.2446 | 54000 | 0.5309 | 0.8230 |
0.4942 | 196.0432 | 54500 | 0.5308 | 0.8230 |
0.4993 | 197.8417 | 55000 | 0.5308 | 0.8230 |
0.4988 | 199.6403 | 55500 | 0.5308 | 0.8230 |
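The accuracy column is presumably the standard accuracy metric over the evaluation set; a hedged sketch of the kind of `compute_metrics` hook that would produce it with the `evaluate` library:

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # transformers.Trainer passes (logits, labels) at each evaluation step.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```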
### Framework versions
- Transformers 4.46.1
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.20.1