xml-roberta-large-ner-qlorafinetune-runs-colab

This model is a fine-tuned version of FacebookAI/xlm-roberta-large on the biobert_json dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0777
  • Precision: 0.9349
  • Recall: 0.9537
  • F1: 0.9442
  • Accuracy: 0.9802
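
This checkpoint is published as a PEFT (QLoRA) adapter, so inference means loading the FacebookAI/xlm-roberta-large base model and attaching the adapter on top. The sketch below is a minimal example of that, not the author's exact code: the number of labels (NUM_LABELS) and the label names for the biobert_json tag set are placeholders, since they are not listed in this card.

```python
# Minimal inference sketch (assumption: the adapter was saved with a token-
# classification head; NUM_LABELS is a placeholder and must match the
# biobert_json label set used at training time).
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer
from peft import PeftModel

BASE_ID = "FacebookAI/xlm-roberta-large"
ADAPTER_ID = "jamesopeth/xml-roberta-large-ner-qlorafinetune-runs-colab"
NUM_LABELS = 9  # placeholder: set to the number of tags in biobert_json

tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
base = AutoModelForTokenClassification.from_pretrained(BASE_ID, num_labels=NUM_LABELS)
model = PeftModel.from_pretrained(base, ADAPTER_ID)
model.eval()

text = "The patient was prescribed 500 mg of amoxicillin."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred_ids = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, pred_id in zip(tokens, pred_ids):
    # Labels print as generic LABEL_i unless id2label is set to the real tag names.
    print(token, base.config.id2label[pred_id])
```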

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0004
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: paged_adamw_8bit with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • training_steps: 2141
  • mixed_precision_training: Native AMP
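
For reference, the sketch below shows a configuration that matches the hyperparameters above. The TrainingArguments values are taken directly from the list (with evaluation every 20 steps, as in the results table below); the 4-bit quantization settings and the LoRA rank/alpha/dropout are assumptions, since they are not recorded in this card.

```python
# Configuration sketch for the QLoRA setup implied by the hyperparameters above.
# Quantization and LoRA settings are assumed values, not taken from this card.
import torch
from transformers import BitsAndBytesConfig, TrainingArguments
from peft import LoraConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # assumed: NF4 4-bit quantization
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

lora_config = LoraConfig(                   # assumed LoRA hyperparameters
    task_type="TOKEN_CLS",
    r=16,
    lora_alpha=32,
    lora_dropout=0.1,
)

training_args = TrainingArguments(
    output_dir="xml-roberta-large-ner-qlorafinetune-runs-colab",
    learning_rate=4e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="paged_adamw_8bit",
    lr_scheduler_type="linear",
    max_steps=2141,
    fp16=True,                              # Native AMP mixed-precision training
    eval_strategy="steps",
    eval_steps=20,
)
```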

Training results

| Training Loss | Epoch  | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:--------------|:-------|:-----|:----------------|:----------|:-------|:-------|:---------|
| 2.1318        | 0.0654 | 20   | 0.9601          | 0.5965    | 0.0370 | 0.0696 | 0.7295   |
| 0.6887        | 0.1307 | 40   | 0.3004          | 0.7758    | 0.7688 | 0.7723 | 0.9181   |
| 0.3473        | 0.1961 | 60   | 0.2058          | 0.8130    | 0.8602 | 0.8359 | 0.9424   |
| 0.25          | 0.2614 | 80   | 0.1636          | 0.8543    | 0.8732 | 0.8637 | 0.9519   |
| 0.2186        | 0.3268 | 100  | 0.1458          | 0.8713    | 0.8847 | 0.8780 | 0.9586   |
| 0.2276        | 0.3922 | 120  | 0.1288          | 0.8641    | 0.9096 | 0.8863 | 0.9630   |
| 0.1733        | 0.4575 | 140  | 0.1151          | 0.9050    | 0.9017 | 0.9034 | 0.9640   |
| 0.1584        | 0.5229 | 160  | 0.1083          | 0.8944    | 0.9414 | 0.9173 | 0.9701   |
| 0.1506        | 0.5882 | 180  | 0.1213          | 0.8635    | 0.9482 | 0.9039 | 0.9648   |
| 0.1303        | 0.6536 | 200  | 0.0963          | 0.8953    | 0.9409 | 0.9175 | 0.9699   |
| 0.1327        | 0.7190 | 220  | 0.1088          | 0.8808    | 0.9144 | 0.8973 | 0.9655   |
| 0.1416        | 0.7843 | 240  | 0.0903          | 0.9173    | 0.9429 | 0.9299 | 0.9748   |
| 0.1229        | 0.8497 | 260  | 0.0924          | 0.9197    | 0.9390 | 0.9292 | 0.9740   |
| 0.1228        | 0.9150 | 280  | 0.1105          | 0.8943    | 0.9463 | 0.9196 | 0.9680   |
| 0.1338        | 0.9804 | 300  | 0.0840          | 0.9174    | 0.9472 | 0.9320 | 0.9749   |
| 0.1141        | 1.0458 | 320  | 0.0906          | 0.9121    | 0.9488 | 0.9301 | 0.9744   |
| 0.0983        | 1.1111 | 340  | 0.0926          | 0.9112    | 0.9570 | 0.9336 | 0.9732   |
| 0.099         | 1.1765 | 360  | 0.0791          | 0.9204    | 0.9508 | 0.9354 | 0.9765   |
| 0.0947        | 1.2418 | 380  | 0.0852          | 0.9271    | 0.9469 | 0.9369 | 0.9769   |
| 0.1163        | 1.3072 | 400  | 0.0764          | 0.9231    | 0.9484 | 0.9356 | 0.9765   |
| 0.105         | 1.3725 | 420  | 0.0863          | 0.9062    | 0.9355 | 0.9206 | 0.9732   |
| 0.096         | 1.4379 | 440  | 0.0833          | 0.9282    | 0.9473 | 0.9377 | 0.9772   |
| 0.0927        | 1.5033 | 460  | 0.0714          | 0.9368    | 0.9520 | 0.9443 | 0.9788   |
| 0.0944        | 1.5686 | 480  | 0.0730          | 0.9296    | 0.9579 | 0.9435 | 0.9784   |
| 0.0797        | 1.6340 | 500  | 0.0837          | 0.9180    | 0.9493 | 0.9334 | 0.9756   |
| 0.0726        | 1.6993 | 520  | 0.0834          | 0.9173    | 0.9660 | 0.9410 | 0.9772   |
| 0.078         | 1.7647 | 540  | 0.0744          | 0.9292    | 0.9435 | 0.9363 | 0.9780   |
| 0.0902        | 1.8301 | 560  | 0.0793          | 0.9275    | 0.9564 | 0.9417 | 0.9767   |
| 0.0874        | 1.8954 | 580  | 0.0859          | 0.9180    | 0.9573 | 0.9372 | 0.9752   |
| 0.089         | 1.9608 | 600  | 0.0860          | 0.9146    | 0.9586 | 0.9361 | 0.9745   |
| 0.0766        | 2.0261 | 620  | 0.0821          | 0.9212    | 0.9569 | 0.9387 | 0.9772   |
| 0.067         | 2.0915 | 640  | 0.0746          | 0.9323    | 0.9583 | 0.9452 | 0.9797   |
| 0.0535        | 2.1569 | 660  | 0.0771          | 0.9236    | 0.9476 | 0.9354 | 0.9774   |
| 0.0794        | 2.2222 | 680  | 0.0779          | 0.9315    | 0.9544 | 0.9428 | 0.9782   |
| 0.0819        | 2.2876 | 700  | 0.0841          | 0.9111    | 0.9443 | 0.9274 | 0.9756   |
| 0.0642        | 2.3529 | 720  | 0.0671          | 0.9406    | 0.9589 | 0.9497 | 0.9818   |
| 0.0681        | 2.4183 | 740  | 0.0724          | 0.9354    | 0.9464 | 0.9409 | 0.9789   |
| 0.0881        | 2.4837 | 760  | 0.0689          | 0.9327    | 0.9575 | 0.9450 | 0.9810   |
| 0.0706        | 2.5490 | 780  | 0.0813          | 0.9242    | 0.9582 | 0.9409 | 0.9782   |
| 0.0765        | 2.6144 | 800  | 0.0689          | 0.9365    | 0.9551 | 0.9457 | 0.9797   |
| 0.062         | 2.6797 | 820  | 0.0716          | 0.9434    | 0.9478 | 0.9456 | 0.9804   |
| 0.093         | 2.7451 | 840  | 0.0754          | 0.9282    | 0.9490 | 0.9385 | 0.9783   |
| 0.0659        | 2.8105 | 860  | 0.0799          | 0.9227    | 0.9524 | 0.9373 | 0.9775   |
| 0.0806        | 2.8758 | 880  | 0.0775          | 0.9197    | 0.9533 | 0.9362 | 0.9777   |
| 0.0632        | 2.9412 | 900  | 0.0754          | 0.9229    | 0.9569 | 0.9396 | 0.9789   |
| 0.064         | 3.0065 | 920  | 0.0700          | 0.9387    | 0.9556 | 0.9471 | 0.9813   |
| 0.0499        | 3.0719 | 940  | 0.0725          | 0.9282    | 0.9573 | 0.9425 | 0.9804   |
| 0.0525        | 3.1373 | 960  | 0.0852          | 0.9258    | 0.9572 | 0.9412 | 0.9771   |
| 0.0488        | 3.2026 | 980  | 0.0740          | 0.9298    | 0.9577 | 0.9436 | 0.9792   |
| 0.0602        | 3.2680 | 1000 | 0.0785          | 0.9274    | 0.9507 | 0.9389 | 0.9777   |
| 0.0574        | 3.3333 | 1020 | 0.0746          | 0.9362    | 0.9537 | 0.9449 | 0.9796   |
| 0.0583        | 3.3987 | 1040 | 0.0768          | 0.9272    | 0.9641 | 0.9453 | 0.9798   |
| 0.0618        | 3.4641 | 1060 | 0.0774          | 0.9264    | 0.9546 | 0.9403 | 0.9783   |
| 0.0503        | 3.5294 | 1080 | 0.0724          | 0.9287    | 0.9484 | 0.9385 | 0.9783   |
| 0.0529        | 3.5948 | 1100 | 0.0777          | 0.9349    | 0.9556 | 0.9451 | 0.9787   |
| 0.0448        | 3.6601 | 1120 | 0.0686          | 0.9383    | 0.9563 | 0.9472 | 0.9815   |
| 0.0658        | 3.7255 | 1140 | 0.0683          | 0.9453    | 0.9576 | 0.9514 | 0.9817   |
| 0.0591        | 3.7908 | 1160 | 0.0650          | 0.9407    | 0.9586 | 0.9496 | 0.9822   |
| 0.0635        | 3.8562 | 1180 | 0.0781          | 0.9283    | 0.9551 | 0.9415 | 0.9779   |
| 0.063         | 3.9216 | 1200 | 0.0764          | 0.9330    | 0.9545 | 0.9436 | 0.9783   |
| 0.0586        | 3.9869 | 1220 | 0.0706          | 0.9334    | 0.9548 | 0.9440 | 0.9808   |
| 0.0446        | 4.0523 | 1240 | 0.0744          | 0.9319    | 0.9556 | 0.9436 | 0.9794   |
| 0.0373        | 4.1176 | 1260 | 0.0713          | 0.9351    | 0.9534 | 0.9442 | 0.9802   |
| 0.0387        | 4.1830 | 1280 | 0.0752          | 0.9371    | 0.9537 | 0.9453 | 0.9805   |
| 0.0449        | 4.2484 | 1300 | 0.0751          | 0.9360    | 0.9536 | 0.9447 | 0.9805   |
| 0.0415        | 4.3137 | 1320 | 0.0740          | 0.9419    | 0.9506 | 0.9462 | 0.9814   |
| 0.0484        | 4.3791 | 1340 | 0.0692          | 0.9409    | 0.9562 | 0.9485 | 0.9815   |
| 0.0414        | 4.4444 | 1360 | 0.0751          | 0.9288    | 0.9555 | 0.9419 | 0.9797   |
| 0.0346        | 4.5098 | 1380 | 0.0790          | 0.9267    | 0.9560 | 0.9411 | 0.9796   |
| 0.0466        | 4.5752 | 1400 | 0.0840          | 0.9187    | 0.9414 | 0.9299 | 0.9770   |
| 0.0467        | 4.6405 | 1420 | 0.0739          | 0.9342    | 0.9579 | 0.9459 | 0.9805   |
| 0.0401        | 4.7059 | 1440 | 0.0781          | 0.9293    | 0.9530 | 0.9410 | 0.9786   |
| 0.0502        | 4.7712 | 1460 | 0.0768          | 0.9323    | 0.9582 | 0.9451 | 0.9801   |
| 0.0403        | 4.8366 | 1480 | 0.0745          | 0.9431    | 0.9564 | 0.9497 | 0.9813   |
| 0.0471        | 4.9020 | 1500 | 0.0772          | 0.9316    | 0.9581 | 0.9447 | 0.9796   |
| 0.0556        | 4.9673 | 1520 | 0.0749          | 0.9324    | 0.9531 | 0.9426 | 0.9801   |
| 0.0398        | 5.0327 | 1540 | 0.0784          | 0.9310    | 0.9534 | 0.9421 | 0.9796   |
| 0.0422        | 5.0980 | 1560 | 0.0741          | 0.9386    | 0.9562 | 0.9473 | 0.9812   |
| 0.0545        | 5.1634 | 1580 | 0.0721          | 0.9398    | 0.9593 | 0.9495 | 0.9817   |
| 0.0367        | 5.2288 | 1600 | 0.0815          | 0.9241    | 0.9526 | 0.9381 | 0.9778   |
| 0.0333        | 5.2941 | 1620 | 0.0741          | 0.9381    | 0.9545 | 0.9463 | 0.9805   |
| 0.0324        | 5.3595 | 1640 | 0.0755          | 0.9368    | 0.9569 | 0.9468 | 0.9807   |
| 0.0343        | 5.4248 | 1660 | 0.0735          | 0.9412    | 0.9536 | 0.9473 | 0.9811   |
| 0.0405        | 5.4902 | 1680 | 0.0773          | 0.9344    | 0.9550 | 0.9446 | 0.9803   |
| 0.0343        | 5.5556 | 1700 | 0.0723          | 0.9412    | 0.9554 | 0.9482 | 0.9815   |
| 0.0379        | 5.6209 | 1720 | 0.0787          | 0.9284    | 0.9513 | 0.9397 | 0.9788   |
| 0.0346        | 5.6863 | 1740 | 0.0741          | 0.9405    | 0.9548 | 0.9476 | 0.9808   |
| 0.0376        | 5.7516 | 1760 | 0.0794          | 0.9224    | 0.9485 | 0.9353 | 0.9781   |
| 0.0288        | 5.8170 | 1780 | 0.0758          | 0.9367    | 0.9594 | 0.9479 | 0.9813   |
| 0.0394        | 5.8824 | 1800 | 0.0750          | 0.9394    | 0.9566 | 0.9479 | 0.9810   |
| 0.0296        | 5.9477 | 1820 | 0.0736          | 0.9396    | 0.9569 | 0.9482 | 0.9814   |
| 0.0335        | 6.0131 | 1840 | 0.0773          | 0.9355    | 0.9549 | 0.9451 | 0.9802   |
| 0.0297        | 6.0784 | 1860 | 0.0760          | 0.9361    | 0.9536 | 0.9447 | 0.9803   |
| 0.027         | 6.1438 | 1880 | 0.0770          | 0.9320    | 0.9528 | 0.9423 | 0.9798   |
| 0.0247        | 6.2092 | 1900 | 0.0788          | 0.9318    | 0.9514 | 0.9415 | 0.9795   |
| 0.0335        | 6.2745 | 1920 | 0.0770          | 0.9390    | 0.9566 | 0.9477 | 0.9810   |
| 0.0286        | 6.3399 | 1940 | 0.0770          | 0.9361    | 0.9558 | 0.9459 | 0.9805   |
| 0.0256        | 6.4052 | 1960 | 0.0765          | 0.9351    | 0.9546 | 0.9447 | 0.9802   |
| 0.0268        | 6.4706 | 1980 | 0.0773          | 0.9335    | 0.9520 | 0.9426 | 0.9801   |
| 0.0247        | 6.5359 | 2000 | 0.0770          | 0.9361    | 0.9550 | 0.9454 | 0.9807   |
| 0.0299        | 6.6013 | 2020 | 0.0773          | 0.9373    | 0.9550 | 0.9461 | 0.9807   |
| 0.024         | 6.6667 | 2040 | 0.0789          | 0.9350    | 0.9525 | 0.9437 | 0.9800   |
| 0.0278        | 6.7320 | 2060 | 0.0778          | 0.9367    | 0.9539 | 0.9452 | 0.9804   |
| 0.0378        | 6.7974 | 2080 | 0.0766          | 0.9372    | 0.9545 | 0.9458 | 0.9807   |
| 0.0232        | 6.8627 | 2100 | 0.0775          | 0.9361    | 0.9538 | 0.9449 | 0.9804   |
| 0.0259        | 6.9281 | 2120 | 0.0780          | 0.9353    | 0.9540 | 0.9446 | 0.9802   |
| 0.025         | 6.9935 | 2140 | 0.0777          | 0.9349    | 0.9537 | 0.9442 | 0.9802   |
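
The precision, recall, and F1 columns above are entity-level scores of the kind typically produced by the seqeval metric in a Trainer compute_metrics hook. Below is a hedged sketch of such a function; whether this exact implementation was used, and the biobert_json label list itself, are assumptions.

```python
# Sketch of a seqeval-based compute_metrics for token classification.
# label_list is a placeholder: fill in the biobert_json tag names, in id order.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O"]  # placeholder

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Drop the -100 positions (padding / special tokens) before scoring.
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```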

Framework versions

  • PEFT 0.13.2
  • Transformers 4.46.3
  • Pytorch 2.5.1+cu121
  • Datasets 3.1.0
  • Tokenizers 0.20.3