library_name: transformers
license: apache-2.0
base_model: google-bert/bert-base-uncased
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: absa-restaurants-bert-base-uncased
    results: []

absa-restaurants-bert-base-uncased

This model is a fine-tuned version of google-bert/bert-base-uncased for aspect-based sentiment analysis (ABSA) on restaurant reviews; the training dataset is not specified. It achieves the following results on the evaluation set:

  • Loss: 0.7275
  • Accuracy: 0.8790

Model description

More information needed

Intended uses & limitations

More information needed
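Pending fuller documentation, the following is a minimal inference sketch. The hub id, the (sentence, aspect) sentence-pair framing, and the negative/neutral/positive label set are assumptions; check the checkpoint's config.json (`id2label`) for the actual labels.

```python
# Hedged sketch: aspect-based sentiment classification with this checkpoint.
# The hub id and label names below are assumptions, not stated in the card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "absa-restaurants-bert-base-uncased"  # hypothetical hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

sentence = "The pasta was great but the service was slow."
aspect = "service"
# ABSA is commonly cast as sentence-pair classification:
# the review is segment A, the aspect term is segment B.
inputs = tokenizer(sentence, aspect, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
# Decode the highest-scoring class via the model's id2label mapping.
print(model.config.id2label[logits.argmax(dim=-1).item()])
```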

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 256
  • eval_batch_size: 256
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 50
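As a sketch, the hyperparameters above map onto `transformers.TrainingArguments` roughly as follows; the `output_dir` is hypothetical, and the card does not specify warmup or weight decay, so defaults are assumed.

```python
# Hedged config sketch: the card's hyperparameters expressed as
# TrainingArguments. output_dir is hypothetical; unstated options
# (warmup, weight decay, etc.) are left at their defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="absa-restaurants-bert-base-uncased",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    eval_strategy="epoch",  # the table reports validation metrics once per epoch
)
```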

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2396        | 1.0   | 23   | 0.4017          | 0.8679   |
| 0.1873        | 2.0   | 46   | 0.4208          | 0.8714   |
| 0.1489        | 3.0   | 69   | 0.4428          | 0.8679   |
| 0.1123        | 4.0   | 92   | 0.4690          | 0.8721   |
| 0.0867        | 5.0   | 115  | 0.5119          | 0.8776   |
| 0.0672        | 6.0   | 138  | 0.5222          | 0.8741   |
| 0.0753        | 7.0   | 161  | 0.5781          | 0.8721   |
| 0.0732        | 8.0   | 184  | 0.5416          | 0.8672   |
| 0.0589        | 9.0   | 207  | 0.5674          | 0.8769   |
| 0.0507        | 10.0  | 230  | 0.5819          | 0.8734   |
| 0.0466        | 11.0  | 253  | 0.5871          | 0.8769   |
| 0.0452        | 12.0  | 276  | 0.5818          | 0.8797   |
| 0.0444        | 13.0  | 299  | 0.6002          | 0.8783   |
| 0.0376        | 14.0  | 322  | 0.5886          | 0.8714   |
| 0.0315        | 15.0  | 345  | 0.5846          | 0.8790   |
| 0.0295        | 16.0  | 368  | 0.6067          | 0.8693   |
| 0.0292        | 17.0  | 391  | 0.6186          | 0.8734   |
| 0.0226        | 18.0  | 414  | 0.6222          | 0.8783   |
| 0.0305        | 19.0  | 437  | 0.6400          | 0.8734   |
| 0.0245        | 20.0  | 460  | 0.6544          | 0.8721   |
| 0.0241        | 21.0  | 483  | 0.6438          | 0.8748   |
| 0.0208        | 22.0  | 506  | 0.6512          | 0.8734   |
| 0.0225        | 23.0  | 529  | 0.6580          | 0.8748   |
| 0.0215        | 24.0  | 552  | 0.6401          | 0.8790   |
| 0.019         | 25.0  | 575  | 0.6705          | 0.8679   |
| 0.0197        | 26.0  | 598  | 0.6809          | 0.8769   |
| 0.0192        | 27.0  | 621  | 0.6569          | 0.8748   |
| 0.0182        | 28.0  | 644  | 0.6763          | 0.8748   |
| 0.0173        | 29.0  | 667  | 0.6846          | 0.8790   |
| 0.0151        | 30.0  | 690  | 0.7100          | 0.8776   |
| 0.0171        | 31.0  | 713  | 0.6979          | 0.8748   |
| 0.0185        | 32.0  | 736  | 0.6947          | 0.8762   |
| 0.0161        | 33.0  | 759  | 0.6979          | 0.8755   |
| 0.0169        | 34.0  | 782  | 0.7015          | 0.8741   |
| 0.0158        | 35.0  | 805  | 0.7085          | 0.8804   |
| 0.0144        | 36.0  | 828  | 0.7080          | 0.8776   |
| 0.0138        | 37.0  | 851  | 0.7070          | 0.8797   |
| 0.0145        | 38.0  | 874  | 0.7088          | 0.8776   |
| 0.0154        | 39.0  | 897  | 0.7177          | 0.8755   |
| 0.0159        | 40.0  | 920  | 0.7217          | 0.8776   |
| 0.0154        | 41.0  | 943  | 0.7165          | 0.8762   |
| 0.013         | 42.0  | 966  | 0.7222          | 0.8783   |
| 0.0142        | 43.0  | 989  | 0.7190          | 0.8776   |
| 0.0143        | 44.0  | 1012 | 0.7207          | 0.8762   |
| 0.0123        | 45.0  | 1035 | 0.7256          | 0.8762   |
| 0.013         | 46.0  | 1058 | 0.7291          | 0.8790   |
| 0.0135        | 47.0  | 1081 | 0.7276          | 0.8797   |
| 0.012         | 48.0  | 1104 | 0.7286          | 0.8797   |
| 0.0135        | 49.0  | 1127 | 0.7278          | 0.8783   |
| 0.0139        | 50.0  | 1150 | 0.7275          | 0.8790   |
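The step counts above are internally consistent: 23 optimizer steps per epoch over 50 epochs gives the final step of 1150, and since steps per epoch equal ceil(n_train / batch_size), the batch size of 256 bounds the (unstated) training-set size. A quick sanity check:

```python
# Sanity check on the table's step counts; the training-set size itself
# is not stated in the card, only bounded by this arithmetic.
import math

train_batch_size = 256
steps_per_epoch = 23  # from the table: step 23 at epoch 1.0
num_epochs = 50

# Total optimizer steps should match the last row of the table (1150).
total_steps = steps_per_epoch * num_epochs
print(total_steps)  # 1150

# steps_per_epoch = ceil(n_train / batch_size), so n_train lies in:
lo = (steps_per_epoch - 1) * train_batch_size + 1  # 5633
hi = steps_per_epoch * train_batch_size            # 5888
assert math.ceil(lo / train_batch_size) == steps_per_epoch
assert math.ceil(hi / train_batch_size) == steps_per_epoch
print(lo, hi)  # 5633 5888
```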

Framework versions

  • Transformers 4.47.0
  • Pytorch 2.5.1+cu121
  • Datasets 3.3.1
  • Tokenizers 0.21.0