GS_bert6

This model is a fine-tuned version of biblo0507/GS_bert5 on an unknown dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the results):

  • Loss: 0.0767
  • F1: 0.7566
  • Precision: 0.7889
  • Recall: 0.7324
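
The task type is not documented in this card; the F1/precision/recall metrics suggest a classification head. The snippet below is a minimal, hedged sketch of loading the checkpoint for inference with the Transformers Auto classes. The head class, the example text, and the tokenizer behaviour are assumptions, not facts from the card (the base model, skt/kobert-base-v1, often needs its own tokenizer setup), so adjust them to match the original training task.

```python
# Hedged sketch: loading GS_bert6 for classification inference.
# Assumptions: the checkpoint carries a sequence-classification head and a
# tokenizer that AutoTokenizer can resolve; neither is confirmed by the card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "biblo0507/GS_bert6"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "예시 문장입니다."  # placeholder Korean example sentence
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
print(model.config.id2label.get(pred, pred))
```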

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after the list):

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 350
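
A minimal sketch of how these values map onto transformers.TrainingArguments. Only the listed hyperparameters come from the card; the output directory and the per-epoch evaluation strategy are assumptions.

```python
# Hedged sketch: reproducing the listed hyperparameters with the HF Trainer.
# "output_dir" and the evaluation strategy are placeholders, not from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="GS_bert6",            # assumption: any local path works
    learning_rate=5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=350,
    eval_strategy="epoch",            # assumption: the results table reports metrics per epoch
)
```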

Training results

Training Loss Epoch Step Validation Loss F1 Precision Recall
0.6976 1.0 45 0.6652 0.0362 0.0370 0.0356
0.5367 2.0 90 0.4968 0.0304 0.0315 0.0296
0.3908 3.0 135 0.3498 0.0320 0.0333 0.0310
0.2893 4.0 180 0.2675 0.0354 0.0370 0.0343
0.2385 5.0 225 0.2163 0.0392 0.0407 0.0380
0.2016 6.0 270 0.1877 0.0460 0.0481 0.0444
0.1847 7.0 315 0.1733 0.0336 0.0352 0.0324
0.1751 8.0 360 0.1663 0.0336 0.0352 0.0324
0.1696 9.0 405 0.1631 0.0360 0.0370 0.0352
0.1695 10.0 450 0.1614 0.0484 0.05 0.0472
0.1666 11.0 495 0.1597 0.0939 0.0981 0.0907
0.1626 12.0 540 0.1565 0.1542 0.1630 0.1477
0.157 13.0 585 0.1519 0.2873 0.3019 0.2764
0.151 14.0 630 0.1463 0.3532 0.3685 0.3417
0.1451 15.0 675 0.1407 0.4304 0.45 0.4157
0.1394 16.0 720 0.1367 0.4529 0.4722 0.4384
0.134 17.0 765 0.1317 0.5127 0.5352 0.4958
0.1273 18.0 810 0.1275 0.5534 0.5778 0.5352
0.1212 19.0 855 0.1245 0.5770 0.6019 0.5583
0.1167 20.0 900 0.1207 0.5992 0.6259 0.5792
0.1145 21.0 945 0.1177 0.6153 0.6426 0.5949
0.1089 22.0 990 0.1146 0.6259 0.6537 0.6051
0.1045 23.0 1035 0.1118 0.6378 0.6667 0.6162
0.1023 24.0 1080 0.1088 0.6614 0.6907 0.6394
0.0963 25.0 1125 0.1059 0.6770 0.7074 0.6542
0.0923 26.0 1170 0.1041 0.6630 0.6926 0.6407
0.0874 27.0 1215 0.1017 0.6725 0.7019 0.6505
0.0852 28.0 1260 0.0999 0.6794 0.7093 0.6569
0.0838 29.0 1305 0.0973 0.6905 0.7204 0.6681
0.0788 30.0 1350 0.0955 0.7005 0.7315 0.6773
0.0753 31.0 1395 0.0935 0.7098 0.7407 0.6866
0.0739 32.0 1440 0.0921 0.6979 0.7278 0.6755
0.0701 33.0 1485 0.0905 0.7146 0.7463 0.6907
0.0692 34.0 1530 0.0889 0.7188 0.75 0.6954
0.0635 35.0 1575 0.0872 0.7257 0.7574 0.7019
0.0625 36.0 1620 0.0861 0.7135 0.7444 0.6903
0.0618 37.0 1665 0.0848 0.7267 0.7593 0.7023
0.0606 38.0 1710 0.0833 0.7257 0.7574 0.7019
0.056 39.0 1755 0.0825 0.7230 0.7556 0.6986
0.0544 40.0 1800 0.0813 0.7312 0.7630 0.7074
0.0541 41.0 1845 0.0803 0.7381 0.7704 0.7139
0.0514 42.0 1890 0.0794 0.7249 0.7574 0.7005
0.05 43.0 1935 0.0782 0.7304 0.7611 0.7074
0.0488 44.0 1980 0.0769 0.7384 0.7704 0.7144
0.0455 45.0 2025 0.0763 0.7357 0.7667 0.7125
0.0455 46.0 2070 0.0753 0.7455 0.7778 0.7213
0.0438 47.0 2115 0.0749 0.7405 0.7722 0.7167
0.0418 48.0 2160 0.0739 0.7476 0.7796 0.7236
0.0412 49.0 2205 0.0735 0.7442 0.7759 0.7204
0.0404 50.0 2250 0.0720 0.7423 0.7741 0.7185
0.0395 51.0 2295 0.0723 0.7455 0.7778 0.7213
0.0376 52.0 2340 0.0707 0.7405 0.7722 0.7167
0.0368 53.0 2385 0.0706 0.7460 0.7778 0.7222
0.0347 54.0 2430 0.0695 0.7553 0.7870 0.7315
0.0337 55.0 2475 0.0697 0.7489 0.7815 0.7245
0.0345 56.0 2520 0.0689 0.7595 0.7926 0.7347
0.0334 57.0 2565 0.0684 0.7532 0.7852 0.7292
0.0319 58.0 2610 0.0687 0.7508 0.7833 0.7264
0.0316 59.0 2655 0.0674 0.7516 0.7833 0.7278
0.0297 60.0 2700 0.0672 0.7487 0.7815 0.7241
0.0292 61.0 2745 0.0670 0.7508 0.7833 0.7264
0.028 62.0 2790 0.0660 0.7471 0.7796 0.7227
0.0279 63.0 2835 0.0659 0.7489 0.7815 0.7245
0.0271 64.0 2880 0.0658 0.7426 0.7741 0.7190
0.0251 65.0 2925 0.0652 0.7458 0.7778 0.7218
0.0258 66.0 2970 0.0657 0.7479 0.7796 0.7241
0.0254 67.0 3015 0.0642 0.7481 0.7796 0.7245
0.024 68.0 3060 0.0648 0.7437 0.7759 0.7194
0.0239 69.0 3105 0.0634 0.7511 0.7833 0.7269
0.0225 70.0 3150 0.0636 0.7548 0.7870 0.7306
0.0225 71.0 3195 0.0640 0.7431 0.7741 0.7199
0.0222 72.0 3240 0.0643 0.7389 0.7704 0.7153
0.021 73.0 3285 0.0653 0.7489 0.7815 0.7245
0.0209 74.0 3330 0.0638 0.7635 0.7963 0.7389
0.021 75.0 3375 0.0632 0.7598 0.7926 0.7352
0.0198 76.0 3420 0.0632 0.7548 0.7870 0.7306
0.0196 77.0 3465 0.0623 0.7598 0.7926 0.7352
0.0192 78.0 3510 0.0627 0.7563 0.7889 0.7319
0.0188 79.0 3555 0.0629 0.7582 0.7907 0.7338
0.0185 80.0 3600 0.0629 0.7434 0.7741 0.7204
0.0178 81.0 3645 0.0615 0.7624 0.7944 0.7384
0.0173 82.0 3690 0.0622 0.7653 0.7981 0.7407
0.0165 83.0 3735 0.0614 0.7563 0.7889 0.7319
0.0173 84.0 3780 0.0631 0.7513 0.7833 0.7273
0.0165 85.0 3825 0.0621 0.7566 0.7889 0.7324
0.0157 86.0 3870 0.0628 0.7537 0.7852 0.7301
0.015 87.0 3915 0.0618 0.7563 0.7889 0.7319
0.0145 88.0 3960 0.0612 0.7606 0.7926 0.7366
0.0145 89.0 4005 0.0624 0.7603 0.7926 0.7361
0.0144 90.0 4050 0.0615 0.7545 0.7870 0.7301
0.0137 91.0 4095 0.0618 0.7563 0.7889 0.7319
0.0145 92.0 4140 0.0605 0.7545 0.7870 0.7301
0.0132 93.0 4185 0.0614 0.7513 0.7833 0.7273
0.0133 94.0 4230 0.0620 0.7566 0.7889 0.7324
0.013 95.0 4275 0.0620 0.7511 0.7833 0.7269
0.0131 96.0 4320 0.0598 0.7638 0.7963 0.7394
0.0125 97.0 4365 0.0607 0.7563 0.7889 0.7319
0.0122 98.0 4410 0.0619 0.7616 0.7944 0.7370
0.012 99.0 4455 0.0619 0.7595 0.7926 0.7347
0.0115 100.0 4500 0.0615 0.7563 0.7889 0.7319
0.0117 101.0 4545 0.0605 0.7651 0.7981 0.7403
0.0111 102.0 4590 0.0607 0.7611 0.7944 0.7361
0.0106 103.0 4635 0.0609 0.7619 0.7944 0.7375
0.0108 104.0 4680 0.0619 0.7511 0.7833 0.7269
0.0107 105.0 4725 0.0611 0.7611 0.7944 0.7361
0.0101 106.0 4770 0.0613 0.7561 0.7889 0.7315
0.0101 107.0 4815 0.0614 0.7582 0.7907 0.7338
0.0102 108.0 4860 0.0606 0.7601 0.7926 0.7356
0.0099 109.0 4905 0.0604 0.7601 0.7926 0.7356
0.0097 110.0 4950 0.0609 0.7524 0.7852 0.7278
0.0095 111.0 4995 0.0614 0.7616 0.7944 0.7370
0.0094 112.0 5040 0.0605 0.7582 0.7907 0.7338
0.0092 113.0 5085 0.0612 0.7532 0.7852 0.7292
0.0093 114.0 5130 0.0604 0.7632 0.7963 0.7384
0.0086 115.0 5175 0.0614 0.7601 0.7926 0.7356
0.0082 116.0 5220 0.0616 0.7569 0.7889 0.7329
0.0084 117.0 5265 0.0617 0.7582 0.7907 0.7338
0.0083 118.0 5310 0.0605 0.7616 0.7944 0.7370
0.0081 119.0 5355 0.0612 0.7632 0.7963 0.7384
0.008 120.0 5400 0.0613 0.7622 0.7944 0.7380
0.0077 121.0 5445 0.0608 0.7667 0.8 0.7417
0.0075 122.0 5490 0.0620 0.7497 0.7815 0.7259
0.0075 123.0 5535 0.0607 0.7582 0.7907 0.7338
0.0072 124.0 5580 0.0609 0.7656 0.7981 0.7412
0.0074 125.0 5625 0.0617 0.7585 0.7907 0.7343
0.0071 126.0 5670 0.0615 0.7688 0.8019 0.7440
0.0072 127.0 5715 0.0608 0.7556 0.7870 0.7319
0.0068 128.0 5760 0.0615 0.7651 0.7981 0.7403
0.0068 129.0 5805 0.0614 0.7545 0.7870 0.7301
0.0069 130.0 5850 0.0616 0.7693 0.8019 0.7449
0.0064 131.0 5895 0.0626 0.7656 0.7981 0.7412
0.0063 132.0 5940 0.0626 0.7616 0.7944 0.7370
0.0064 133.0 5985 0.0623 0.7603 0.7926 0.7361
0.0062 134.0 6030 0.0614 0.7579 0.7907 0.7333
0.006 135.0 6075 0.0617 0.7622 0.7944 0.7380
0.0062 136.0 6120 0.0628 0.7653 0.7981 0.7407
0.0059 137.0 6165 0.0621 0.7669 0.8 0.7421
0.0054 138.0 6210 0.0619 0.7616 0.7944 0.7370
0.0055 139.0 6255 0.0626 0.7603 0.7926 0.7361
0.0055 140.0 6300 0.0626 0.7648 0.7981 0.7398
0.0053 141.0 6345 0.0629 0.7685 0.8019 0.7435
0.0054 142.0 6390 0.0622 0.7635 0.7963 0.7389
0.0052 143.0 6435 0.0609 0.7672 0.8000 0.7426
0.0053 144.0 6480 0.0634 0.7616 0.7944 0.7370
0.0052 145.0 6525 0.0630 0.7550 0.7870 0.7310
0.005 146.0 6570 0.0637 0.7569 0.7889 0.7329
0.0052 147.0 6615 0.0630 0.7616 0.7944 0.7370
0.0049 148.0 6660 0.0617 0.7667 0.8 0.7417
0.0049 149.0 6705 0.0636 0.7598 0.7926 0.7352
0.0044 150.0 6750 0.0631 0.7582 0.7907 0.7338
0.0049 151.0 6795 0.0635 0.7651 0.7981 0.7403
0.0045 152.0 6840 0.0614 0.7669 0.8 0.7421
0.0044 153.0 6885 0.0632 0.7675 0.8 0.7431
0.0046 154.0 6930 0.0638 0.7619 0.7944 0.7375
0.0044 155.0 6975 0.0630 0.7616 0.7944 0.7370
0.0042 156.0 7020 0.0622 0.7675 0.8 0.7431
0.0041 157.0 7065 0.0629 0.7601 0.7926 0.7356
0.0043 158.0 7110 0.0635 0.7632 0.7963 0.7384
0.0043 159.0 7155 0.0633 0.7601 0.7926 0.7356
0.004 160.0 7200 0.0625 0.7640 0.7963 0.7398
0.0041 161.0 7245 0.0631 0.7632 0.7963 0.7384
0.0039 162.0 7290 0.0638 0.7635 0.7963 0.7389
0.004 163.0 7335 0.0641 0.7619 0.7944 0.7375
0.0039 164.0 7380 0.0633 0.7685 0.8019 0.7435
0.0037 165.0 7425 0.0647 0.7563 0.7889 0.7319
0.0038 166.0 7470 0.0636 0.7672 0.8 0.7426
0.0038 167.0 7515 0.0646 0.7688 0.8019 0.7440
0.0036 168.0 7560 0.0643 0.7619 0.7944 0.7375
0.0035 169.0 7605 0.0643 0.7582 0.7907 0.7338
0.0034 170.0 7650 0.0648 0.7667 0.8 0.7417
0.0034 171.0 7695 0.0638 0.7603 0.7926 0.7361
0.0034 172.0 7740 0.0642 0.7685 0.8019 0.7435
0.0034 173.0 7785 0.0642 0.7635 0.7963 0.7389
0.0034 174.0 7830 0.0643 0.7653 0.7981 0.7407
0.0034 175.0 7875 0.0648 0.7619 0.7944 0.7375
0.0033 176.0 7920 0.0651 0.7683 0.8019 0.7431
0.0032 177.0 7965 0.0641 0.7651 0.7981 0.7403
0.0031 178.0 8010 0.0656 0.7601 0.7926 0.7356
0.0031 179.0 8055 0.0650 0.7585 0.7907 0.7343
0.003 180.0 8100 0.0675 0.7622 0.7944 0.7380
0.0029 181.0 8145 0.0661 0.7601 0.7926 0.7356
0.0029 182.0 8190 0.0655 0.7653 0.7981 0.7407
0.0029 183.0 8235 0.0662 0.7619 0.7944 0.7375
0.0029 184.0 8280 0.0658 0.7619 0.7944 0.7375
0.0027 185.0 8325 0.0658 0.7601 0.7926 0.7356
0.0028 186.0 8370 0.0660 0.7648 0.7981 0.7398
0.0027 187.0 8415 0.0666 0.7669 0.8 0.7421
0.0027 188.0 8460 0.0655 0.7685 0.8019 0.7435
0.0027 189.0 8505 0.0663 0.7651 0.7981 0.7403
0.0028 190.0 8550 0.0668 0.7653 0.7981 0.7407
0.0026 191.0 8595 0.0666 0.7653 0.7981 0.7407
0.0026 192.0 8640 0.0655 0.7672 0.8 0.7426
0.0027 193.0 8685 0.0676 0.7616 0.7944 0.7370
0.0026 194.0 8730 0.0668 0.7651 0.7981 0.7403
0.0024 195.0 8775 0.0669 0.7635 0.7963 0.7389
0.0024 196.0 8820 0.0666 0.7616 0.7944 0.7370
0.0024 197.0 8865 0.0669 0.7653 0.7981 0.7407
0.0024 198.0 8910 0.0680 0.7619 0.7944 0.7375
0.0023 199.0 8955 0.0662 0.7688 0.8019 0.7440
0.0023 200.0 9000 0.0670 0.7669 0.8 0.7421
0.0022 201.0 9045 0.0673 0.7656 0.7981 0.7412
0.0022 202.0 9090 0.0672 0.7653 0.7981 0.7407
0.0022 203.0 9135 0.0673 0.7651 0.7981 0.7403
0.0023 204.0 9180 0.0677 0.7653 0.7981 0.7407
0.0021 205.0 9225 0.0682 0.7616 0.7944 0.7370
0.0021 206.0 9270 0.0683 0.7635 0.7963 0.7389
0.002 207.0 9315 0.0678 0.7635 0.7963 0.7389
0.0021 208.0 9360 0.0685 0.7653 0.7981 0.7407
0.002 209.0 9405 0.0673 0.7601 0.7926 0.7356
0.0021 210.0 9450 0.0679 0.7638 0.7963 0.7394
0.002 211.0 9495 0.0682 0.7653 0.7981 0.7407
0.002 212.0 9540 0.0676 0.7653 0.7981 0.7407
0.002 213.0 9585 0.0687 0.7653 0.7981 0.7407
0.002 214.0 9630 0.0685 0.7619 0.7944 0.7375
0.0019 215.0 9675 0.0682 0.7632 0.7963 0.7384
0.0019 216.0 9720 0.0684 0.7601 0.7926 0.7356
0.0019 217.0 9765 0.0689 0.7563 0.7889 0.7319
0.002 218.0 9810 0.0699 0.7585 0.7907 0.7343
0.0019 219.0 9855 0.0699 0.7616 0.7944 0.7370
0.002 220.0 9900 0.0680 0.7706 0.8037 0.7458
0.0018 221.0 9945 0.0687 0.7616 0.7944 0.7370
0.0024 222.0 9990 0.0716 0.7542 0.7870 0.7296
0.0025 223.0 10035 0.0776 0.7458 0.7778 0.7218
0.0029 224.0 10080 0.0735 0.7476 0.7796 0.7236
0.0028 225.0 10125 0.0720 0.7553 0.7870 0.7315
0.0019 226.0 10170 0.0708 0.7532 0.7852 0.7292
0.0018 227.0 10215 0.0696 0.7622 0.7944 0.7380
0.0017 228.0 10260 0.0706 0.7606 0.7926 0.7366
0.0016 229.0 10305 0.0706 0.7624 0.7944 0.7384
0.0021 230.0 10350 0.0706 0.7616 0.7944 0.7370
0.0015 231.0 10395 0.0704 0.7688 0.8019 0.7440
0.0015 232.0 10440 0.0701 0.7672 0.8 0.7426
0.0016 233.0 10485 0.0710 0.7622 0.7944 0.7380
0.0015 234.0 10530 0.0710 0.7601 0.7926 0.7356
0.0015 235.0 10575 0.0708 0.7638 0.7963 0.7394
0.0015 236.0 10620 0.0712 0.7635 0.7963 0.7389
0.0016 237.0 10665 0.0710 0.7603 0.7926 0.7361
0.0014 238.0 10710 0.0719 0.7622 0.7944 0.7380
0.0014 239.0 10755 0.0718 0.7585 0.7907 0.7343
0.0015 240.0 10800 0.0712 0.7638 0.7963 0.7394
0.0014 241.0 10845 0.0720 0.7606 0.7926 0.7366
0.0014 242.0 10890 0.0716 0.7640 0.7963 0.7398
0.0014 243.0 10935 0.0717 0.7622 0.7944 0.7380
0.0015 244.0 10980 0.0720 0.7603 0.7926 0.7361
0.0015 245.0 11025 0.0726 0.7619 0.7944 0.7375
0.0013 246.0 11070 0.0721 0.7603 0.7926 0.7361
0.0013 247.0 11115 0.0722 0.7606 0.7926 0.7366
0.0013 248.0 11160 0.0720 0.7635 0.7963 0.7389
0.0013 249.0 11205 0.0721 0.7619 0.7944 0.7375
0.0012 250.0 11250 0.0730 0.7638 0.7963 0.7394
0.0013 251.0 11295 0.0722 0.7638 0.7963 0.7394
0.0013 252.0 11340 0.0727 0.7603 0.7926 0.7361
0.0013 253.0 11385 0.0726 0.7603 0.7926 0.7361
0.0012 254.0 11430 0.0727 0.7603 0.7926 0.7361
0.0012 255.0 11475 0.0726 0.7601 0.7926 0.7356
0.0012 256.0 11520 0.0727 0.7622 0.7944 0.7380
0.0012 257.0 11565 0.0727 0.7603 0.7926 0.7361
0.0013 258.0 11610 0.0727 0.7585 0.7907 0.7343
0.0013 259.0 11655 0.0728 0.7585 0.7907 0.7343
0.0012 260.0 11700 0.0733 0.7587 0.7907 0.7347
0.0011 261.0 11745 0.0730 0.7603 0.7926 0.7361
0.0012 262.0 11790 0.0735 0.7587 0.7907 0.7347
0.0012 263.0 11835 0.0732 0.7550 0.7870 0.7310
0.0011 264.0 11880 0.0731 0.7603 0.7926 0.7361
0.0011 265.0 11925 0.0735 0.7566 0.7889 0.7324
0.0011 266.0 11970 0.0734 0.7585 0.7907 0.7343
0.0011 267.0 12015 0.0737 0.7603 0.7926 0.7361
0.0011 268.0 12060 0.0743 0.7566 0.7889 0.7324
0.0011 269.0 12105 0.0738 0.7566 0.7889 0.7324
0.0011 270.0 12150 0.0733 0.7566 0.7889 0.7324
0.001 271.0 12195 0.0733 0.7569 0.7889 0.7329
0.001 272.0 12240 0.0734 0.7619 0.7944 0.7375
0.0011 273.0 12285 0.0735 0.7603 0.7926 0.7361
0.0011 274.0 12330 0.0735 0.7582 0.7907 0.7338
0.0011 275.0 12375 0.0736 0.7585 0.7907 0.7343
0.001 276.0 12420 0.0737 0.7603 0.7926 0.7361
0.0011 277.0 12465 0.0728 0.7603 0.7926 0.7361
0.001 278.0 12510 0.0730 0.7603 0.7926 0.7361
0.001 279.0 12555 0.0734 0.7638 0.7963 0.7394
0.001 280.0 12600 0.0743 0.7622 0.7944 0.7380
0.001 281.0 12645 0.0738 0.7638 0.7963 0.7394
0.001 282.0 12690 0.0740 0.7638 0.7963 0.7394
0.001 283.0 12735 0.0740 0.7603 0.7926 0.7361
0.001 284.0 12780 0.0740 0.7603 0.7926 0.7361
0.001 285.0 12825 0.0743 0.7603 0.7926 0.7361
0.001 286.0 12870 0.0743 0.7622 0.7944 0.7380
0.0009 287.0 12915 0.0744 0.7587 0.7907 0.7347
0.0009 288.0 12960 0.0749 0.7569 0.7889 0.7329
0.001 289.0 13005 0.0750 0.7606 0.7926 0.7366
0.001 290.0 13050 0.0747 0.7585 0.7907 0.7343
0.001 291.0 13095 0.0737 0.7656 0.7981 0.7412
0.0009 292.0 13140 0.0745 0.7587 0.7907 0.7347
0.0009 293.0 13185 0.0744 0.7585 0.7907 0.7343
0.0009 294.0 13230 0.0746 0.7619 0.7944 0.7375
0.0009 295.0 13275 0.0748 0.7616 0.7944 0.7370
0.0009 296.0 13320 0.0742 0.7582 0.7907 0.7338
0.0009 297.0 13365 0.0749 0.7616 0.7944 0.7370
0.0009 298.0 13410 0.0754 0.7566 0.7889 0.7324
0.0008 299.0 13455 0.0747 0.7601 0.7926 0.7356
0.0009 300.0 13500 0.0745 0.7585 0.7907 0.7343
0.0009 301.0 13545 0.0750 0.7585 0.7907 0.7343
0.0009 302.0 13590 0.0751 0.7585 0.7907 0.7343
0.0009 303.0 13635 0.0758 0.7566 0.7889 0.7324
0.0009 304.0 13680 0.0752 0.7566 0.7889 0.7324
0.0009 305.0 13725 0.0747 0.7550 0.7870 0.7310
0.0009 306.0 13770 0.0754 0.7582 0.7907 0.7338
0.001 307.0 13815 0.0747 0.7582 0.7907 0.7338
0.0009 308.0 13860 0.0757 0.7582 0.7907 0.7338
0.0009 309.0 13905 0.0752 0.7601 0.7926 0.7356
0.0008 310.0 13950 0.0753 0.7582 0.7907 0.7338
0.0008 311.0 13995 0.0756 0.7603 0.7926 0.7361
0.0008 312.0 14040 0.0757 0.7569 0.7889 0.7329
0.0008 313.0 14085 0.0761 0.7616 0.7944 0.7370
0.0009 314.0 14130 0.0761 0.7582 0.7907 0.7338
0.0008 315.0 14175 0.0758 0.7582 0.7907 0.7338
0.0008 316.0 14220 0.0756 0.7601 0.7926 0.7356
0.0008 317.0 14265 0.0759 0.7656 0.7981 0.7412
0.0008 318.0 14310 0.0762 0.7619 0.7944 0.7375
0.0008 319.0 14355 0.0763 0.7566 0.7889 0.7324
0.0008 320.0 14400 0.0762 0.7550 0.7870 0.7310
0.0008 321.0 14445 0.0765 0.7550 0.7870 0.7310
0.0008 322.0 14490 0.0759 0.7550 0.7870 0.7310
0.0008 323.0 14535 0.0764 0.7566 0.7889 0.7324
0.0008 324.0 14580 0.0764 0.7534 0.7852 0.7296
0.0008 325.0 14625 0.0760 0.7550 0.7870 0.7310
0.0008 326.0 14670 0.0761 0.7566 0.7889 0.7324
0.0008 327.0 14715 0.0762 0.7566 0.7889 0.7324
0.0008 328.0 14760 0.0763 0.7550 0.7870 0.7310
0.0008 329.0 14805 0.0761 0.7601 0.7926 0.7356
0.0007 330.0 14850 0.0764 0.7585 0.7907 0.7343
0.0008 331.0 14895 0.0763 0.7585 0.7907 0.7343
0.0007 332.0 14940 0.0763 0.7582 0.7907 0.7338
0.0008 333.0 14985 0.0763 0.7566 0.7889 0.7324
0.0008 334.0 15030 0.0766 0.7566 0.7889 0.7324
0.0008 335.0 15075 0.0766 0.7566 0.7889 0.7324
0.0008 336.0 15120 0.0766 0.7566 0.7889 0.7324
0.0008 337.0 15165 0.0763 0.7550 0.7870 0.7310
0.0007 338.0 15210 0.0766 0.7550 0.7870 0.7310
0.0008 339.0 15255 0.0767 0.7550 0.7870 0.7310
0.0008 340.0 15300 0.0765 0.7566 0.7889 0.7324
0.0008 341.0 15345 0.0767 0.7550 0.7870 0.7310
0.0008 342.0 15390 0.0767 0.7566 0.7889 0.7324
0.0007 343.0 15435 0.0768 0.7550 0.7870 0.7310
0.0007 344.0 15480 0.0767 0.7550 0.7870 0.7310
0.0007 345.0 15525 0.0767 0.7566 0.7889 0.7324
0.0007 346.0 15570 0.0766 0.7566 0.7889 0.7324
0.0008 347.0 15615 0.0766 0.7566 0.7889 0.7324
0.0007 348.0 15660 0.0767 0.7566 0.7889 0.7324
0.0007 349.0 15705 0.0767 0.7566 0.7889 0.7324
0.0007 350.0 15750 0.0767 0.7566 0.7889 0.7324
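
The table above reports F1, precision, and recall at every epoch, but the card does not say how they were averaged (macro, micro, or weighted) or whether the task is sequence- or token-level. The sketch below assumes a macro-averaged sequence-classification setup with scikit-learn, purely for illustration of the kind of compute_metrics callback that could produce such columns.

```python
# Hedged sketch: a compute_metrics callback of the sort the Trainer could have
# used for the per-epoch F1/precision/recall above. The macro averaging and the
# sequence-level framing are assumptions, not taken from the card.
import numpy as np
from sklearn.metrics import precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {"f1": f1, "precision": precision, "recall": recall}
```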

Framework versions

  • Transformers 4.51.0.dev0
  • Pytorch 2.5.1+cu121
  • Datasets 3.4.1
  • Tokenizers 0.21.0

Model tree for biblo0507/GS_bert6

  • Base model: skt/kobert-base-v1
  • Fine-tuning chain: biblo0507/GS_bert2 → biblo0507/GS_bert3 → biblo0507/GS_bert4 → biblo0507/GS_bert5 → this model (GS_bert6)
  • Finetunes: 1 model