---
license: apache-2.0
base_model: allenai/longformer-base-4096
tags:
- generated_from_trainer
datasets:
- essays_su_g
metrics:
- accuracy
model-index:
- name: longformer-simple
  results:
  - task:
      name: Token Classification
      type: token-classification
    dataset:
      name: essays_su_g
      type: essays_su_g
      config: simple
      split: train[80%:100%]
      args: simple
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.8449255946993012
---
|
|
|
|
|
|
# longformer-simple |
|
|
|
This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
It achieves the following results on the evaluation set:

- Loss: 0.6609
- Accuracy: 0.8449

| Label        | Precision | Recall | F1-score | Support |
|:-------------|----------:|-------:|---------:|--------:|
| Claim        |    0.6079 | 0.6152 |   0.6115 |    4168 |
| Majorclaim   |    0.7830 | 0.7946 |   0.7887 |    2152 |
| O            |    0.9341 | 0.9045 |   0.9191 |    9226 |
| Premise      |    0.8725 | 0.8877 |   0.8800 |   12073 |
| Macro avg    |    0.7994 | 0.8005 |   0.7998 |   27619 |
| Weighted avg |    0.8462 | 0.8449 |   0.8454 |   27619 |
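
These per-class and averaged scores (and the per-epoch entries in the training results table further down) follow the dictionary format produced by scikit-learn's `classification_report` with `output_dict=True`. The evaluation code itself is not part of this card, so the following is only a minimal sketch of how such a report can be computed from flattened token-level labels:

```python
# Minimal sketch: produce metrics in the same format as the ones reported above.
# y_true / y_pred are illustrative flattened token labels; the actual evaluation
# code used for this model is not included in the card.
from sklearn.metrics import classification_report

y_true = ["O", "Claim", "Premise", "Premise", "Majorclaim", "O"]
y_pred = ["O", "Claim", "Premise", "Claim", "Majorclaim", "O"]

report = classification_report(y_true, y_pred, output_dict=True, zero_division=0)
print(report["Claim"])         # {'precision': ..., 'recall': ..., 'f1-score': ..., 'support': ...}
print(report["accuracy"])      # overall token accuracy
print(report["macro avg"])     # unweighted mean over the labels
print(report["weighted avg"])  # support-weighted mean over the labels
```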
|
|
|
## Model description |
|
|
|
longformer-simple is a token classification model obtained by fine-tuning [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the `simple` configuration of the essays_su_g dataset. Each token is assigned one of four labels: `Majorclaim`, `Claim`, `Premise`, or `O` (tokens outside any argument component), making the model usable for argument component identification in essays. The Longformer backbone supports sequences of up to 4,096 tokens, so a whole essay can usually be processed in a single forward pass.
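
A minimal inference sketch, assuming the fine-tuned weights are available locally or on the Hugging Face Hub; `"path/to/longformer-simple"` is a placeholder for the actual repository id or the Trainer output directory:

```python
# Hedged sketch: token-level argument component tagging with this model.
# "path/to/longformer-simple" is a placeholder; substitute the real Hub repo id
# or the local directory produced by the Trainer.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

model_name = "path/to/longformer-simple"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name)

text = "Some people believe that school uniforms limit creativity. However, uniforms reduce peer pressure."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=4096)

with torch.no_grad():
    logits = model(**inputs).logits

predicted_ids = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
for token, label_id in zip(tokens, predicted_ids):
    # Map each predicted class index back to its label name (Claim, Majorclaim, Premise, O).
    print(token, model.config.id2label[int(label_id)])
```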
|
|
|
## Intended uses & limitations |
|
|
|
The model is intended for token-level argument mining: marking major claims, claims, and premises in essays similar to those in the essays_su_g dataset. It has only been evaluated on a held-out 20% slice of that dataset's `train` split, so performance on other domains, genres, or languages is unknown. Performance is also uneven across classes: `Claim` tokens are recognised noticeably less reliably (F1 ≈ 0.61) than `Premise` (F1 ≈ 0.88) or `O` (F1 ≈ 0.92) tokens.
|
|
|
## Training and evaluation data |
|
|
|
The model was trained and evaluated on the `simple` configuration of the essays_su_g dataset. According to the metadata above, the evaluation set is the `train[80%:100%]` slice (27,619 labelled tokens), which suggests the first 80% of the `train` split was used for training and the remaining 20% for evaluation.
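
A hedged sketch of how this split could be loaded with 🤗 Datasets; the plain `"essays_su_g"` identifier is an assumption, and a namespaced Hub id may be required instead:

```python
# Hedged sketch: recreate the train/evaluation split implied by the metadata above.
# "essays_su_g" is assumed to resolve on the Hugging Face Hub; use the full
# "<namespace>/essays_su_g" id if needed, and add trust_remote_code=True if the
# dataset ships a loading script.
from datasets import load_dataset

train_data = load_dataset("essays_su_g", "simple", split="train[:80%]")
eval_data = load_dataset("essays_su_g", "simple", split="train[80%:100%]")

print(train_data)
print(eval_data)
```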
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training (a hedged `TrainingArguments` reconstruction is sketched after the list):
|
- learning_rate: 2e-05 |
|
- train_batch_size: 8 |
|
- eval_batch_size: 8 |
|
- seed: 42 |
|
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 |
|
- lr_scheduler_type: linear |
|
- num_epochs: 16 |
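
These values correspond roughly to the `TrainingArguments` below. This is a reconstruction from the list above rather than the original training script; `output_dir` and the evaluation cadence are assumptions:

```python
# Hedged reconstruction of the hyperparameters listed above as TrainingArguments.
# Only learning rate, batch sizes, seed, scheduler, and epochs come from the card;
# output_dir and evaluation_strategy are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="longformer-simple",   # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=16,
    evaluation_strategy="epoch",      # assumption: the metrics below are reported once per epoch
)
# The Adam settings listed above (betas=(0.9, 0.999), epsilon=1e-08) match the
# Trainer's default optimizer configuration.
```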
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Claim | Majorclaim | O | Premise | Accuracy | Macro avg | Weighted avg | |
|
|:-------------:|:-----:|:----:|:---------------:|:---------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:--------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:| |
|
| No log | 1.0 | 41 | 0.5690 | {'precision': 0.49395770392749244, 'recall': 0.23536468330134358, 'f1-score': 0.31881702957426067, 'support': 4168.0} | {'precision': 0.5330313325783315, 'recall': 0.6561338289962825, 'f1-score': 0.5882107894188711, 'support': 2152.0} | {'precision': 0.9200096957944491, 'recall': 0.82278343810969, 'f1-score': 0.8686845568461407, 'support': 9226.0} | {'precision': 0.777574153261386, 'recall': 0.9488942267870455, 'f1-score': 0.8547340147728121, 'support': 12073.0} | 0.7763 | {'precision': 0.6811432213904147, 'recall': 0.6657940442985903, 'f1-score': 0.6576115976530211, 'support': 27619.0} | {'precision': 0.7632992267425562, 'recall': 0.7762772004779318, 'f1-score': 0.7577517824653167, 'support': 27619.0} | |
|
| No log | 2.0 | 82 | 0.4430 | {'precision': 0.6068347710683477, 'recall': 0.43881957773512476, 'f1-score': 0.5093288777499304, 'support': 4168.0} | {'precision': 0.6947840260798696, 'recall': 0.7922862453531598, 'f1-score': 0.7403386886669561, 'support': 2152.0} | {'precision': 0.930324074074074, 'recall': 0.8712334706264904, 'f1-score': 0.8998096943915818, 'support': 9226.0} | {'precision': 0.8270298275479239, 'recall': 0.9255363207156465, 'f1-score': 0.8735146966854284, 'support': 12073.0} | 0.8236 | {'precision': 0.7647431746925538, 'recall': 0.7569689036076054, 'f1-score': 0.7557479893734742, 'support': 27619.0} | {'precision': 0.8180007808150275, 'recall': 0.823563488902567, 'f1-score': 0.8169621924766614, 'support': 27619.0} | |
|
| No log | 3.0 | 123 | 0.4280 | {'precision': 0.5555102040816327, 'recall': 0.6530710172744721, 'f1-score': 0.6003528892809882, 'support': 4168.0} | {'precision': 0.7618816682832201, 'recall': 0.7300185873605948, 'f1-score': 0.7456098718557191, 'support': 2152.0} | {'precision': 0.9472815190470575, 'recall': 0.8705831346195534, 'f1-score': 0.9073143179892686, 'support': 9226.0} | {'precision': 0.8730497618656594, 'recall': 0.8806427565642343, 'f1-score': 0.8768298214506619, 'support': 12073.0} | 0.8312 | {'precision': 0.7844307883193924, 'recall': 0.7835788739547136, 'f1-score': 0.7825267251441594, 'support': 27619.0} | {'precision': 0.8412645262496828, 'recall': 0.8312031572468228, 'f1-score': 0.8350654121763821, 'support': 27619.0} | |
|
| No log | 4.0 | 164 | 0.4198 | {'precision': 0.6521200866604766, 'recall': 0.5055182341650671, 'f1-score': 0.5695364238410595, 'support': 4168.0} | {'precision': 0.7789709172259508, 'recall': 0.8090148698884758, 'f1-score': 0.7937086847503988, 'support': 2152.0} | {'precision': 0.91675722668985, 'recall': 0.9143724257533059, 'f1-score': 0.9155632732797916, 'support': 9226.0} | {'precision': 0.85398810902633, 'recall': 0.9160937629421022, 'f1-score': 0.8839514066496164, 'support': 12073.0} | 0.8452 | {'precision': 0.8004590849006519, 'recall': 0.7862498231872379, 'f1-score': 0.7906899471302166, 'support': 27619.0} | {'precision': 0.8386466761572305, 'recall': 0.8452152503711213, 'f1-score': 0.8400311740436862, 'support': 27619.0} | |
|
| No log | 5.0 | 205 | 0.4471 | {'precision': 0.5814893617021276, 'recall': 0.6557101727447217, 'f1-score': 0.6163734776725303, 'support': 4168.0} | {'precision': 0.7235804416403786, 'recall': 0.8526951672862454, 'f1-score': 0.7828498293515358, 'support': 2152.0} | {'precision': 0.9300457436126297, 'recall': 0.9035334923043572, 'f1-score': 0.9165979438121942, 'support': 9226.0} | {'precision': 0.9016637478108581, 'recall': 0.8528948894226787, 'f1-score': 0.8766015408845188, 'support': 12073.0} | 0.8400 | {'precision': 0.7841948236914985, 'recall': 0.8162084304395008, 'f1-score': 0.7981056979301948, 'support': 27619.0} | {'precision': 0.8489511288560475, 'recall': 0.8400376552373366, 'f1-score': 0.843386093646175, 'support': 27619.0} | |
|
| No log | 6.0 | 246 | 0.4595 | {'precision': 0.5807517554729451, 'recall': 0.6746641074856046, 'f1-score': 0.6241953385127637, 'support': 4168.0} | {'precision': 0.7883110906580764, 'recall': 0.796003717472119, 'f1-score': 0.7921387283236995, 'support': 2152.0} | {'precision': 0.9110802732707088, 'recall': 0.925102969867765, 'f1-score': 0.9180380767989674, 'support': 9226.0} | {'precision': 0.9042363830544677, 'recall': 0.8415472542035948, 'f1-score': 0.8717662705392766, 'support': 12073.0} | 0.8407 | {'precision': 0.7960948756140495, 'recall': 0.8093295122572709, 'f1-score': 0.8015346035436768, 'support': 27619.0} | {'precision': 0.8486726976979458, 'recall': 0.8407255874579094, 'f1-score': 0.8436577064716956, 'support': 27619.0} | |
|
| No log | 7.0 | 287 | 0.5069 | {'precision': 0.6110236220472441, 'recall': 0.5585412667946257, 'f1-score': 0.5836049135121585, 'support': 4168.0} | {'precision': 0.8053691275167785, 'recall': 0.7806691449814126, 'f1-score': 0.7928268050967437, 'support': 2152.0} | {'precision': 0.9251618566882476, 'recall': 0.9138304790808585, 'f1-score': 0.9194612574295218, 'support': 9226.0} | {'precision': 0.8609833465503569, 'recall': 0.8992793837488611, 'f1-score': 0.8797147834541992, 'support': 12073.0} | 0.8435 | {'precision': 0.8006344882006567, 'recall': 0.7880800686514394, 'f1-score': 0.7939019398731558, 'support': 27619.0} | {'precision': 0.8403669956123412, 'recall': 0.8434773163402006, 'f1-score': 0.8415357075120093, 'support': 27619.0} | |
|
| No log | 8.0 | 328 | 0.5486 | {'precision': 0.5794648982391951, 'recall': 0.6079654510556622, 'f1-score': 0.5933731413183467, 'support': 4168.0} | {'precision': 0.7641959254442999, 'recall': 0.8192379182156134, 'f1-score': 0.7907602601480151, 'support': 2152.0} | {'precision': 0.9482497964879637, 'recall': 0.8838066334272707, 'f1-score': 0.9148948106591865, 'support': 9226.0} | {'precision': 0.86709886547812, 'recall': 0.8862751594466992, 'f1-score': 0.8765821488551183, 'support': 12073.0} | 0.8382 | {'precision': 0.7897523714123946, 'recall': 0.7993212905363115, 'f1-score': 0.7939025902451666, 'support': 27619.0} | {'precision': 0.8427820179127555, 'recall': 0.8382273072884608, 'f1-score': 0.8399540584062747, 'support': 27619.0} | |
|
| No log | 9.0 | 369 | 0.5624 | {'precision': 0.5684468999386126, 'recall': 0.6665067178502879, 'f1-score': 0.6135836554389841, 'support': 4168.0} | {'precision': 0.7784669915817457, 'recall': 0.8164498141263941, 'f1-score': 0.7970061238376048, 'support': 2152.0} | {'precision': 0.9420438957475995, 'recall': 0.893236505527856, 'f1-score': 0.9169912095248693, 'support': 9226.0} | {'precision': 0.8849663170461328, 'recall': 0.8596040752091444, 'f1-score': 0.8721008403361344, 'support': 12073.0} | 0.8383 | {'precision': 0.7934810260785227, 'recall': 0.8089492781784205, 'f1-score': 0.7999204572843982, 'support': 27619.0} | {'precision': 0.8479685351639584, 'recall': 0.8383359281653934, 'f1-score': 0.8422320938058151, 'support': 27619.0} | |
|
| No log | 10.0 | 410 | 0.5923 | {'precision': 0.6067892503536068, 'recall': 0.6175623800383877, 'f1-score': 0.612128418549346, 'support': 4168.0} | {'precision': 0.7623089983022071, 'recall': 0.8345724907063197, 'f1-score': 0.7968056787932565, 'support': 2152.0} | {'precision': 0.9368265850062379, 'recall': 0.8952959028831563, 'f1-score': 0.9155905337249902, 'support': 9226.0} | {'precision': 0.8744673877417241, 'recall': 0.8839559347303901, 'f1-score': 0.879186060880669, 'support': 12073.0} | 0.8437 | {'precision': 0.795098055350944, 'recall': 0.8078466770895635, 'f1-score': 0.8009276729870654, 'support': 27619.0} | {'precision': 0.846163633922067, 'recall': 0.8436945580940657, 'f1-score': 0.8446261141401151, 'support': 27619.0} | |
|
| No log | 11.0 | 451 | 0.6036 | {'precision': 0.5938604240282686, 'recall': 0.6451535508637236, 'f1-score': 0.6184452621895125, 'support': 4168.0} | {'precision': 0.7668161434977578, 'recall': 0.7946096654275093, 'f1-score': 0.7804655408489276, 'support': 2152.0} | {'precision': 0.9390562819783969, 'recall': 0.8951875135486668, 'f1-score': 0.9165973031463293, 'support': 9226.0} | {'precision': 0.8781700646444555, 'recall': 0.8776608962146939, 'f1-score': 0.8779154066034218, 'support': 12073.0} | 0.8420 | {'precision': 0.7944757285372197, 'recall': 0.8031529065136485, 'f1-score': 0.7983558781970478, 'support': 27619.0} | {'precision': 0.8469270804932185, 'recall': 0.841956624063145, 'f1-score': 0.8440870820617664, 'support': 27619.0} | |
|
| No log | 12.0 | 492 | 0.6292 | {'precision': 0.594930767425487, 'recall': 0.6082053742802304, 'f1-score': 0.6014948392454621, 'support': 4168.0} | {'precision': 0.7890961262553802, 'recall': 0.766728624535316, 'f1-score': 0.7777515908555267, 'support': 2152.0} | {'precision': 0.9292805354155047, 'recall': 0.9029915456319099, 'f1-score': 0.9159474465394976, 'support': 9226.0} | {'precision': 0.872541050235734, 'recall': 0.8890913608879317, 'f1-score': 0.8807384615384615, 'support': 12073.0} | 0.8418 | {'precision': 0.7964621198330264, 'recall': 0.791754226333847, 'f1-score': 0.7939830845447369, 'support': 27619.0} | {'precision': 0.8430984692266364, 'recall': 0.8418117962272349, 'f1-score': 0.8423345704559697, 'support': 27619.0} | |
|
| 0.2689 | 13.0 | 533 | 0.6506 | {'precision': 0.6016401590457257, 'recall': 0.5808541266794626, 'f1-score': 0.5910644531250001, 'support': 4168.0} | {'precision': 0.7968977217644208, 'recall': 0.7639405204460966, 'f1-score': 0.7800711743772243, 'support': 2152.0} | {'precision': 0.9178990865593737, 'recall': 0.9149143724257534, 'f1-score': 0.9164042992074695, 'support': 9226.0} | {'precision': 0.8670557717250325, 'recall': 0.8859438416300837, 'f1-score': 0.8763980498996273, 'support': 12073.0} | 0.8401 | {'precision': 0.7958731847736381, 'recall': 0.786413215295349, 'f1-score': 0.7909844941523303, 'support': 27619.0} | {'precision': 0.8385191855162285, 'recall': 0.8400738621963141, 'f1-score': 0.8391965505199718, 'support': 27619.0} | |
|
| 0.2689 | 14.0 | 574 | 0.6476 | {'precision': 0.6124620060790273, 'recall': 0.5801343570057581, 'f1-score': 0.5958600295712174, 'support': 4168.0} | {'precision': 0.77728285077951, 'recall': 0.8108736059479554, 'f1-score': 0.7937229929497385, 'support': 2152.0} | {'precision': 0.9248128577719067, 'recall': 0.9105787990461739, 'f1-score': 0.9176406335335883, 'support': 9226.0} | {'precision': 0.8699562469615946, 'recall': 0.8893398492503934, 'f1-score': 0.8795412656154004, 'support': 12073.0} | 0.8437 | {'precision': 0.7961284903980096, 'recall': 0.7977316528125702, 'f1-score': 0.7966912304174861, 'support': 27619.0} | {'precision': 0.8422013661459805, 'recall': 0.8436583511350881, 'f1-score': 0.8427709427870771, 'support': 27619.0} | |
|
| 0.2689 | 15.0 | 615 | 0.6652 | {'precision': 0.5992348158775705, 'recall': 0.6012476007677543, 'f1-score': 0.6002395209580837, 'support': 4168.0} | {'precision': 0.7798372513562387, 'recall': 0.8015799256505576, 'f1-score': 0.7905591200733272, 'support': 2152.0} | {'precision': 0.9388219240391176, 'recall': 0.8948623455451984, 'f1-score': 0.916315205327414, 'support': 9226.0} | {'precision': 0.8667846512750382, 'recall': 0.8924873685082415, 'f1-score': 0.8794482533463925, 'support': 12073.0} | 0.8422 | {'precision': 0.7961696606369912, 'recall': 0.797544310117938, 'f1-score': 0.7966405249263044, 'support': 27619.0} | {'precision': 0.8436975503647769, 'recall': 0.842246279734965, 'f1-score': 0.842701922471951, 'support': 27619.0} | |
|
| 0.2689 | 16.0 | 656 | 0.6609 | {'precision': 0.6078710289236605, 'recall': 0.6151631477927063, 'f1-score': 0.6114953493918436, 'support': 4168.0} | {'precision': 0.782967032967033, 'recall': 0.7946096654275093, 'f1-score': 0.7887453874538746, 'support': 2152.0} | {'precision': 0.934072084172823, 'recall': 0.9045089963147627, 'f1-score': 0.9190528634361235, 'support': 9226.0} | {'precision': 0.8725067166001791, 'recall': 0.8876832601673155, 'f1-score': 0.8800295615043522, 'support': 12073.0} | 0.8449 | {'precision': 0.7993542156659239, 'recall': 0.8004912674255735, 'f1-score': 0.7998307904465485, 'support': 27619.0} | {'precision': 0.8461593157460915, 'recall': 0.8449255946993012, 'f1-score': 0.8454278324403368, 'support': 27619.0} | |
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.37.2 |
|
- Pytorch 2.2.0+cu121 |
|
- Datasets 2.17.0 |
|
- Tokenizers 0.15.2 |
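
To reproduce the numbers in this card it is safest to match the versions above; a quick check of the local environment:

```python
# Compare the installed library versions against those listed in this card.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.37.2",
    "torch": "2.2.0+cu121",
    "datasets": "2.17.0",
    "tokenizers": "0.15.2",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    have = installed[name]
    status = "OK" if have == want else f"differs (card lists {want})"
    print(f"{name}: {have} {status}")
```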
|
|