absa-bert

This model is a fine-tuned version of google-bert/bert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0596
  • Accuracy: 0.8281
  • F1: 0.7632
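
For inference, a minimal sketch using the Transformers pipeline API (the exact input format this checkpoint expects is not documented here; the sentence/aspect pairing below and the label names are assumptions):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as a text-classification pipeline.
classifier = pipeline("text-classification", model="hai2131/absa-bert")

# Assumption: as an aspect-based sentiment (ABSA) model, the checkpoint
# likely scores a (sentence, aspect) pair; this pairing format is a guess.
print(classifier({"text": "The battery lasts all day but the screen is dim.",
                  "text_pair": "battery"}))
```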

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 256
  • eval_batch_size: 256
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
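
A sketch of the equivalent TrainingArguments for a standard Trainer setup, assuming the hyperparameters above; the output directory and the per-device interpretation of the batch sizes are assumptions, and the dataset, model head, and metric code are not documented in this card.

```python
from transformers import TrainingArguments

# Reconstruction of the hyperparameters listed above; output_dir is an assumption.
training_args = TrainingArguments(
    output_dir="absa-bert",
    learning_rate=1e-5,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    seed=42,
    optim="adamw_torch",        # AdamW with default betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="epoch",      # validation metrics are logged once per epoch below
)
```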

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-:|:-:|:-:|:-:|:-:|:-:|
| 1.0575 | 1.0 | 12 | 0.9426 | 0.5933 | 0.2483 |
| 0.9027 | 2.0 | 24 | 0.8538 | 0.5952 | 0.2537 |
| 0.8016 | 3.0 | 36 | 0.7493 | 0.6876 | 0.4561 |
| 0.7066 | 4.0 | 48 | 0.6882 | 0.7246 | 0.5049 |
| 0.6343 | 5.0 | 60 | 0.6381 | 0.7394 | 0.5289 |
| 0.5703 | 6.0 | 72 | 0.6112 | 0.7542 | 0.6037 |
| 0.5229 | 7.0 | 84 | 0.6028 | 0.7634 | 0.6459 |
| 0.4864 | 8.0 | 96 | 0.6066 | 0.7634 | 0.6432 |
| 0.4479 | 9.0 | 108 | 0.6055 | 0.7689 | 0.6703 |
| 0.4154 | 10.0 | 120 | 0.5939 | 0.7708 | 0.6736 |
| 0.3815 | 11.0 | 132 | 0.6057 | 0.7708 | 0.6768 |
| 0.3525 | 12.0 | 144 | 0.6229 | 0.7689 | 0.6807 |
| 0.3229 | 13.0 | 156 | 0.6119 | 0.7745 | 0.6874 |
| 0.2962 | 14.0 | 168 | 0.6206 | 0.7874 | 0.7034 |
| 0.2718 | 15.0 | 180 | 0.6303 | 0.7726 | 0.6865 |
| 0.2448 | 16.0 | 192 | 0.6469 | 0.7745 | 0.6858 |
| 0.2241 | 17.0 | 204 | 0.6496 | 0.7745 | 0.6806 |
| 0.1985 | 18.0 | 216 | 0.6656 | 0.7763 | 0.6827 |
| 0.1784 | 19.0 | 228 | 0.6906 | 0.7726 | 0.6771 |
| 0.1693 | 20.0 | 240 | 0.6967 | 0.7856 | 0.7027 |
| 0.148 | 21.0 | 252 | 0.7059 | 0.7874 | 0.6945 |
| 0.1336 | 22.0 | 264 | 0.7480 | 0.7837 | 0.6823 |
| 0.1286 | 23.0 | 276 | 0.7424 | 0.7856 | 0.6896 |
| 0.1166 | 24.0 | 288 | 0.7293 | 0.7893 | 0.6970 |
| 0.1043 | 25.0 | 300 | 0.7177 | 0.7985 | 0.7151 |
| 0.0879 | 26.0 | 312 | 0.7184 | 0.8004 | 0.7237 |
| 0.0872 | 27.0 | 324 | 0.7407 | 0.7911 | 0.7121 |
| 0.0827 | 28.0 | 336 | 0.7513 | 0.7967 | 0.7193 |
| 0.0702 | 29.0 | 348 | 0.7797 | 0.8059 | 0.7261 |
| 0.0618 | 30.0 | 360 | 0.7911 | 0.8133 | 0.7378 |
| 0.0641 | 31.0 | 372 | 0.7861 | 0.8115 | 0.7369 |
| 0.051 | 32.0 | 384 | 0.8209 | 0.8078 | 0.7249 |
| 0.0473 | 33.0 | 396 | 0.7894 | 0.8133 | 0.7423 |
| 0.0463 | 34.0 | 408 | 0.8047 | 0.8152 | 0.7460 |
| 0.0447 | 35.0 | 420 | 0.8356 | 0.8133 | 0.7389 |
| 0.0408 | 36.0 | 432 | 0.8416 | 0.8078 | 0.7330 |
| 0.0393 | 37.0 | 444 | 0.8392 | 0.8133 | 0.7377 |
| 0.0356 | 38.0 | 456 | 0.8373 | 0.8207 | 0.7511 |
| 0.0387 | 39.0 | 468 | 0.8539 | 0.8096 | 0.7363 |
| 0.0347 | 40.0 | 480 | 0.8908 | 0.8078 | 0.7258 |
| 0.032 | 41.0 | 492 | 0.8500 | 0.8170 | 0.7474 |
| 0.0336 | 42.0 | 504 | 0.8515 | 0.8133 | 0.7423 |
| 0.0336 | 43.0 | 516 | 0.8515 | 0.8152 | 0.7407 |
| 0.0273 | 44.0 | 528 | 0.8832 | 0.8244 | 0.7497 |
| 0.0283 | 45.0 | 540 | 0.8701 | 0.8115 | 0.7398 |
| 0.0242 | 46.0 | 552 | 0.8904 | 0.8189 | 0.7487 |
| 0.027 | 47.0 | 564 | 0.8972 | 0.8244 | 0.7518 |
| 0.0275 | 48.0 | 576 | 0.8824 | 0.8189 | 0.7529 |
| 0.0197 | 49.0 | 588 | 0.9016 | 0.8152 | 0.7433 |
| 0.0205 | 50.0 | 600 | 0.9236 | 0.8152 | 0.7446 |
| 0.0235 | 51.0 | 612 | 0.9441 | 0.8189 | 0.7473 |
| 0.0215 | 52.0 | 624 | 0.9369 | 0.8207 | 0.7492 |
| 0.0195 | 53.0 | 636 | 0.9186 | 0.8244 | 0.7567 |
| 0.0155 | 54.0 | 648 | 0.9427 | 0.8244 | 0.7564 |
| 0.0184 | 55.0 | 660 | 0.9445 | 0.8226 | 0.7540 |
| 0.0159 | 56.0 | 672 | 0.9560 | 0.8189 | 0.7491 |
| 0.0172 | 57.0 | 684 | 0.9785 | 0.8152 | 0.7386 |
| 0.0151 | 58.0 | 696 | 0.9750 | 0.8152 | 0.7442 |
| 0.0159 | 59.0 | 708 | 0.9631 | 0.8207 | 0.7516 |
| 0.0126 | 60.0 | 720 | 1.0032 | 0.8189 | 0.7471 |
| 0.0153 | 61.0 | 732 | 0.9948 | 0.8170 | 0.7458 |
| 0.018 | 62.0 | 744 | 0.9880 | 0.8170 | 0.7494 |
| 0.0173 | 63.0 | 756 | 0.9989 | 0.8207 | 0.7523 |
| 0.0161 | 64.0 | 768 | 1.0125 | 0.8096 | 0.7313 |
| 0.0165 | 65.0 | 780 | 1.0105 | 0.8189 | 0.7474 |
| 0.0125 | 66.0 | 792 | 1.0118 | 0.8133 | 0.7374 |
| 0.0139 | 67.0 | 804 | 0.9968 | 0.8189 | 0.7514 |
| 0.0112 | 68.0 | 816 | 1.0152 | 0.8170 | 0.7442 |
| 0.0168 | 69.0 | 828 | 1.0176 | 0.8189 | 0.7463 |
| 0.0127 | 70.0 | 840 | 1.0108 | 0.8133 | 0.7404 |
| 0.0124 | 71.0 | 852 | 1.0187 | 0.8207 | 0.7487 |
| 0.0129 | 72.0 | 864 | 1.0138 | 0.8207 | 0.7531 |
| 0.0087 | 73.0 | 876 | 1.0301 | 0.8207 | 0.7481 |
| 0.0112 | 74.0 | 888 | 1.0131 | 0.8281 | 0.7629 |
| 0.0122 | 75.0 | 900 | 1.0176 | 0.8299 | 0.7658 |
| 0.0086 | 76.0 | 912 | 1.0301 | 0.8226 | 0.7543 |
| 0.0103 | 77.0 | 924 | 1.0421 | 0.8207 | 0.7527 |
| 0.0102 | 78.0 | 936 | 1.0410 | 0.8189 | 0.7480 |
| 0.0114 | 79.0 | 948 | 1.0451 | 0.8226 | 0.7539 |
| 0.0102 | 80.0 | 960 | 1.0286 | 0.8281 | 0.7632 |
| 0.0117 | 81.0 | 972 | 1.0311 | 0.8262 | 0.7603 |
| 0.0117 | 82.0 | 984 | 1.0386 | 0.8262 | 0.7589 |
| 0.0136 | 83.0 | 996 | 1.0320 | 0.8262 | 0.7603 |
| 0.01 | 84.0 | 1008 | 1.0304 | 0.8299 | 0.7679 |
| 0.0103 | 85.0 | 1020 | 1.0362 | 0.8262 | 0.7617 |
| 0.0105 | 86.0 | 1032 | 1.0440 | 0.8262 | 0.7617 |
| 0.0099 | 87.0 | 1044 | 1.0555 | 0.8244 | 0.7581 |
| 0.0117 | 88.0 | 1056 | 1.0615 | 0.8244 | 0.7571 |
| 0.0098 | 89.0 | 1068 | 1.0553 | 0.8262 | 0.7601 |
| 0.0105 | 90.0 | 1080 | 1.0586 | 0.8262 | 0.7603 |
| 0.0106 | 91.0 | 1092 | 1.0583 | 0.8262 | 0.7603 |
| 0.01 | 92.0 | 1104 | 1.0558 | 0.8262 | 0.7603 |
| 0.0105 | 93.0 | 1116 | 1.0589 | 0.8262 | 0.7603 |
| 0.011 | 94.0 | 1128 | 1.0609 | 0.8281 | 0.7632 |
| 0.0093 | 95.0 | 1140 | 1.0614 | 0.8281 | 0.7632 |
| 0.0096 | 96.0 | 1152 | 1.0615 | 0.8281 | 0.7632 |
| 0.0118 | 97.0 | 1164 | 1.0630 | 0.8281 | 0.7632 |
| 0.0089 | 98.0 | 1176 | 1.0618 | 0.8281 | 0.7632 |
| 0.0122 | 99.0 | 1188 | 1.0601 | 0.8281 | 0.7632 |
| 0.0101 | 100.0 | 1200 | 1.0596 | 0.8281 | 0.7632 |
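
For reference, a minimal compute_metrics function that would produce the Accuracy and F1 columns above, assuming scikit-learn and macro-averaged F1 (the card does not state which F1 variant was used):

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        # Assumption: macro averaging; the F1 variant is not documented.
        "f1": f1_score(labels, preds, average="macro"),
    }
```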

Framework versions

  • Transformers 4.51.1
  • PyTorch 2.5.1+cu124
  • Datasets 3.5.0
  • Tokenizers 0.21.0