Theoreticallyhugo committed
Commit 025546c • 1 Parent(s): 2722470

trainer: training complete at 2024-03-02 13:40:54.470673.

Files changed:
- README.md +28 -28
- meta_data/README_s42_e16.md +28 -28
- meta_data/meta_s42_e16_cvi3.json +1 -1
- model.safetensors +1 -1
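
For anyone who wants to inspect these exact artifacts, the four files can be fetched pinned to this revision with `huggingface_hub`. A minimal sketch; the full repository id is not shown on this page, so `REPO_ID` below is a placeholder:

```python
# Sketch: download the files touched by this commit, pinned to revision 025546c
# (the abbreviated commit hash shown above). REPO_ID is a placeholder -- the
# commit page does not display the full repository id.
from huggingface_hub import hf_hub_download

REPO_ID = "Theoreticallyhugo/<model-repo>"  # hypothetical

for filename in [
    "README.md",
    "meta_data/README_s42_e16.md",
    "meta_data/meta_s42_e16_cvi3.json",
    "model.safetensors",
]:
    local_path = hf_hub_download(repo_id=REPO_ID, filename=filename, revision="025546c")
    print(filename, "->", local_path)
```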
README.md
CHANGED

Before:

@@ -17,12 +17,12 @@ model-index:
 name: essays_su_g
 type: essays_su_g
 config: simple
-split: train[
 args: simple
 metrics:
 - name: Accuracy
 type: accuracy
-value: 0.
 ---

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You

@@ -32,14 +32,14 @@ should probably proofread and complete it, then remove this comment. -->

 This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Claim: {'precision': 0.
-- Majorclaim: {'precision': 0.
-- O: {'precision': 0.
-- Premise: {'precision': 0.
-- Accuracy: 0.
-- Macro avg: {'precision': 0.
-- Weighted avg: {'precision': 0.

 ## Model description

@@ -68,24 +68,24 @@ The following hyperparameters were used during training:

 ### Training results

-| Training Loss | Epoch | Step | Validation Loss | Claim
-|
-| No log | 1.0 | 41 | 0.
-| No log | 2.0 | 82 | 0.
-| No log | 3.0 | 123 | 0.
-| No log | 4.0 | 164 | 0.
-| No log | 5.0 | 205 | 0.
-| No log | 6.0 | 246 | 0.
-| No log | 7.0 | 287 | 0.
-| No log | 8.0 | 328 | 0.
-| No log | 9.0 | 369 | 0.
-| No log | 10.0 | 410 | 0.
-| No log | 11.0 | 451 | 0.
-| No log | 12.0 | 492 | 0.
-| 0.
-| 0.
-| 0.
-| 0.

 ### Framework versions

After:

@@ -17,12 +17,12 @@ model-index:
name: essays_su_g
type: essays_su_g
config: simple
split: train[60%:80%]
args: simple
metrics:
- name: Accuracy
type: accuracy
value: 0.858776119402985
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You

@@ -32,14 +32,14 @@ should probably proofread and complete it, then remove this comment. -->

This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6472
- Claim: {'precision': 0.6572622779519331, 'recall': 0.6366396761133604, 'f1-score': 0.6467866323907456, 'support': 4940.0}
- Majorclaim: {'precision': 0.8274678111587983, 'recall': 0.8811700182815356, 'f1-score': 0.8534749889331562, 'support': 2188.0}
- O: {'precision': 0.9268028016178357, 'recall': 0.8970686527260575, 'f1-score': 0.9116933527413877, 'support': 10473.0}
- Premise: {'precision': 0.8801698670605613, 'recall': 0.8994905339958488, 'f1-score': 0.8897253242915357, 'support': 15899.0}
- Accuracy: 0.8588
- Macro avg: {'precision': 0.8229256894472821, 'recall': 0.8285922202792007, 'f1-score': 0.8254200745892063, 'support': 33500.0}
- Weighted avg: {'precision': 0.8584358710936555, 'recall': 0.858776119402985, 'f1-score': 0.8584010941482899, 'support': 33500.0}

## Model description

@@ -68,24 +68,24 @@ The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Claim | Majorclaim | O | Premise | Accuracy | Macro avg | Weighted avg |
|:-------------:|:-----:|:----:|:---------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:--------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
| No log | 1.0 | 41 | 0.6237 | {'precision': 0.4813399941228328, 'recall': 0.33157894736842103, 'f1-score': 0.3926645091693635, 'support': 4940.0} | {'precision': 0.41758530183727033, 'recall': 0.7271480804387569, 'f1-score': 0.5305101700566855, 'support': 2188.0} | {'precision': 0.8614998552263295, 'recall': 0.8522868328081734, 'f1-score': 0.8568685802054334, 'support': 10473.0} | {'precision': 0.8528192892126083, 'recall': 0.8542675639977357, 'f1-score': 0.8535428122545169, 'support': 15899.0} | 0.7683 | {'precision': 0.6533111100997602, 'recall': 0.6913203561532717, 'f1-score': 0.6583965179214999, 'support': 33500.0} | {'precision': 0.7723271066974134, 'recall': 0.7682686567164179, 'f1-score': 0.7655218131315448, 'support': 33500.0} |
| No log | 2.0 | 82 | 0.4751 | {'precision': 0.5846230654018971, 'recall': 0.47408906882591095, 'f1-score': 0.5235859602056785, 'support': 4940.0} | {'precision': 0.7269767441860465, 'recall': 0.7143510054844607, 'f1-score': 0.7206085753803596, 'support': 2188.0} | {'precision': 0.9142337609859582, 'recall': 0.8641268022534135, 'f1-score': 0.8884743765953269, 'support': 10473.0} | {'precision': 0.8357695614789338, 'recall': 0.917038807472168, 'f1-score': 0.8745201535508637, 'support': 15899.0} | 0.8219 | {'precision': 0.7654007830132088, 'recall': 0.7424014210089883, 'f1-score': 0.7517972664330571, 'support': 33500.0} | {'precision': 0.816159208839521, 'recall': 0.8219402985074626, 'f1-score': 0.8170804260816812, 'support': 33500.0} |
| No log | 3.0 | 123 | 0.4586 | {'precision': 0.6658894070619586, 'recall': 0.4046558704453441, 'f1-score': 0.5033996474439688, 'support': 4940.0} | {'precision': 0.7872244714349977, 'recall': 0.79981718464351, 'f1-score': 0.7934708682838358, 'support': 2188.0} | {'precision': 0.9342819121711536, 'recall': 0.8714790413444095, 'f1-score': 0.9017883608339096, 'support': 10473.0} | {'precision': 0.8168702042580784, 'recall': 0.9508145166362665, 'f1-score': 0.8787676209853219, 'support': 15899.0} | 0.8356 | {'precision': 0.8010664987315472, 'recall': 0.7566916532673825, 'f1-score': 0.7693566243867591, 'support': 33500.0} | {'precision': 0.8293759599418965, 'recall': 0.8356119402985075, 'f1-score': 0.8250407291712659, 'support': 33500.0} |
| No log | 4.0 | 164 | 0.4525 | {'precision': 0.5575898801597869, 'recall': 0.6781376518218624, 'f1-score': 0.6119839240043845, 'support': 4940.0} | {'precision': 0.7466456195737964, 'recall': 0.8647166361974405, 'f1-score': 0.8013553578991952, 'support': 2188.0} | {'precision': 0.9201592832254853, 'recall': 0.8825551417931825, 'f1-score': 0.9009650063359004, 'support': 10473.0} | {'precision': 0.8922416683430564, 'recall': 0.836907981634065, 'f1-score': 0.8636894716344281, 'support': 15899.0} | 0.8296 | {'precision': 0.7791591128255312, 'recall': 0.8155793528616375, 'f1-score': 0.7944984399684771, 'support': 33500.0} | {'precision': 0.8421114352783158, 'recall': 0.8295820895522388, 'f1-score': 0.8341543739861718, 'support': 33500.0} |
| No log | 5.0 | 205 | 0.4721 | {'precision': 0.662877030162413, 'recall': 0.5783400809716599, 'f1-score': 0.6177297297297297, 'support': 4940.0} | {'precision': 0.7945205479452054, 'recall': 0.8747714808043876, 'f1-score': 0.8327169893408746, 'support': 2188.0} | {'precision': 0.9125229313507772, 'recall': 0.9024157357013273, 'f1-score': 0.9074411905904946, 'support': 10473.0} | {'precision': 0.8726254262055528, 'recall': 0.9014403421598842, 'f1-score': 0.8867988738669058, 'support': 15899.0} | 0.8524 | {'precision': 0.8106364839159872, 'recall': 0.8142419099093148, 'f1-score': 0.8111716958820011, 'support': 33500.0} | {'precision': 0.8490670984831405, 'recall': 0.8523582089552239, 'f1-score': 0.8500422842449816, 'support': 33500.0} |
| No log | 6.0 | 246 | 0.4792 | {'precision': 0.6428419936373276, 'recall': 0.6135627530364373, 'f1-score': 0.6278612118073537, 'support': 4940.0} | {'precision': 0.804950917626974, 'recall': 0.8619744058500914, 'f1-score': 0.83248730964467, 'support': 2188.0} | {'precision': 0.9285714285714286, 'recall': 0.8949680129857729, 'f1-score': 0.9114601059950406, 'support': 10473.0} | {'precision': 0.872155615365794, 'recall': 0.8967859613812189, 'f1-score': 0.8842993146649301, 'support': 15899.0} | 0.8522 | {'precision': 0.812129988800381, 'recall': 0.8168227833133802, 'f1-score': 0.8140269855279986, 'support': 33500.0} | {'precision': 0.8515881419840463, 'recall': 0.8521791044776119, 'f1-score': 0.8515914362320791, 'support': 33500.0} |
| No log | 7.0 | 287 | 0.5202 | {'precision': 0.6744186046511628, 'recall': 0.5342105263157895, 'f1-score': 0.5961820851688694, 'support': 4940.0} | {'precision': 0.8121475054229935, 'recall': 0.8555758683729433, 'f1-score': 0.8332962385933673, 'support': 2188.0} | {'precision': 0.9198786930150655, 'recall': 0.8978325217225246, 'f1-score': 0.9087219135056778, 'support': 10473.0} | {'precision': 0.8582063305978898, 'recall': 0.9208755267626895, 'f1-score': 0.8884371491853515, 'support': 15899.0} | 0.8524 | {'precision': 0.816162783421778, 'recall': 0.8021236107934867, 'f1-score': 0.8066593466133165, 'support': 33500.0} | {'precision': 0.8473766761482054, 'recall': 0.8523880597014926, 'f1-score': 0.8480805524125185, 'support': 33500.0} |
| No log | 8.0 | 328 | 0.5458 | {'precision': 0.6705622932745314, 'recall': 0.6155870445344129, 'f1-score': 0.6418997361477573, 'support': 4940.0} | {'precision': 0.8129251700680272, 'recall': 0.8738574040219378, 'f1-score': 0.8422907488986784, 'support': 2188.0} | {'precision': 0.9259259259259259, 'recall': 0.89277188962093, 'f1-score': 0.909046716251033, 'support': 10473.0} | {'precision': 0.8728428701180745, 'recall': 0.9066607962764954, 'f1-score': 0.8894304929968533, 'support': 15899.0} | 0.8573 | {'precision': 0.8205640648466398, 'recall': 0.822219283613444, 'f1-score': 0.8206669235735804, 'support': 33500.0} | {'precision': 0.8556957914959558, 'recall': 0.8572537313432835, 'f1-score': 0.8559826424660975, 'support': 33500.0} |
| No log | 9.0 | 369 | 0.5550 | {'precision': 0.6423661737138097, 'recall': 0.6242914979757085, 'f1-score': 0.6331998768093625, 'support': 4940.0} | {'precision': 0.8291592128801432, 'recall': 0.8473491773308958, 'f1-score': 0.8381555153707052, 'support': 2188.0} | {'precision': 0.909720885466795, 'recall': 0.9025112193258856, 'f1-score': 0.9061017111633034, 'support': 10473.0} | {'precision': 0.8796739874323399, 'recall': 0.8893012139128247, 'f1-score': 0.8844614037282621, 'support': 15899.0} | 0.8516 | {'precision': 0.8152300648732719, 'recall': 0.8158632771363286, 'f1-score': 0.8154796267679083, 'support': 33500.0} | {'precision': 0.8507741138987609, 'recall': 0.8516119402985075, 'f1-score': 0.8511506488942767, 'support': 33500.0} |
| No log | 10.0 | 410 | 0.5788 | {'precision': 0.6611198560827524, 'recall': 0.5951417004048583, 'f1-score': 0.6263982102908278, 'support': 4940.0} | {'precision': 0.8315460232350312, 'recall': 0.8505484460694699, 'f1-score': 0.8409399005874378, 'support': 2188.0} | {'precision': 0.9248446592366111, 'recall': 0.8953499474840065, 'f1-score': 0.9098583349505143, 'support': 10473.0} | {'precision': 0.8645358599184456, 'recall': 0.9067865903515945, 'f1-score': 0.8851573292402148, 'support': 15899.0} | 0.8536 | {'precision': 0.8205115996182102, 'recall': 0.8119566710774824, 'f1-score': 0.8155884437672487, 'support': 33500.0} | {'precision': 0.851239060922849, 'recall': 0.8535820895522388, 'f1-score': 0.8518342203238483, 'support': 33500.0} |
| No log | 11.0 | 451 | 0.5865 | {'precision': 0.661878453038674, 'recall': 0.6062753036437247, 'f1-score': 0.6328578975171685, 'support': 4940.0} | {'precision': 0.829535495179667, 'recall': 0.8651736745886655, 'f1-score': 0.8469798657718122, 'support': 2188.0} | {'precision': 0.9291244788564622, 'recall': 0.8937267258665139, 'f1-score': 0.9110819097678493, 'support': 10473.0} | {'precision': 0.8703893134364282, 'recall': 0.9098056481539719, 'f1-score': 0.88966111076942, 'support': 15899.0} | 0.8571 | {'precision': 0.8227319351278078, 'recall': 0.818745338063219, 'f1-score': 0.8201451959565625, 'support': 33500.0} | {'precision': 0.8553356293389153, 'recall': 0.8571044776119403, 'f1-score': 0.8557012776467233, 'support': 33500.0} |
| No log | 12.0 | 492 | 0.6140 | {'precision': 0.6268885064065787, 'recall': 0.6635627530364372, 'f1-score': 0.6447044940505456, 'support': 4940.0} | {'precision': 0.8325078793336335, 'recall': 0.8450639853747715, 'f1-score': 0.8387389430709912, 'support': 2188.0} | {'precision': 0.923546196989078, 'recall': 0.896209300105032, 'f1-score': 0.9096724171351037, 'support': 10473.0} | {'precision': 0.885440926543715, 'recall': 0.8847726272092584, 'f1-score': 0.885106650726735, 'support': 15899.0} | 0.8531 | {'precision': 0.8170958773182513, 'recall': 0.8224021664313748, 'f1-score': 0.8195556262458439, 'support': 33500.0} | {'precision': 0.8557695842930038, 'recall': 0.8531343283582089, 'f1-score': 0.8543077872420695, 'support': 33500.0} |
| 0.2701 | 13.0 | 533 | 0.6368 | {'precision': 0.6831773567678612, 'recall': 0.6058704453441296, 'f1-score': 0.642205771912885, 'support': 4940.0} | {'precision': 0.8174536256323778, 'recall': 0.8861974405850092, 'f1-score': 0.8504385964912281, 'support': 2188.0} | {'precision': 0.9274289099526066, 'recall': 0.8968776854769407, 'f1-score': 0.9118974807048201, 'support': 10473.0} | {'precision': 0.8733377459534268, 'recall': 0.912887602993899, 'f1-score': 0.892674826250077, 'support': 15899.0} | 0.8609 | {'precision': 0.8253494095765681, 'recall': 0.8254582935999946, 'f1-score': 0.8243041688397525, 'support': 33500.0} | {'precision': 0.8585565514078823, 'recall': 0.8608656716417911, 'f1-score': 0.8589909116520602, 'support': 33500.0} |
| 0.2701 | 14.0 | 574 | 0.6486 | {'precision': 0.6641386782231853, 'recall': 0.6204453441295547, 'f1-score': 0.641548927263213, 'support': 4940.0} | {'precision': 0.8142076502732241, 'recall': 0.8852833638025595, 'f1-score': 0.8482592511495511, 'support': 2188.0} | {'precision': 0.9240070782540307, 'recall': 0.897450587224291, 'f1-score': 0.9105352385565513, 'support': 10473.0} | {'precision': 0.8767601322395004, 'recall': 0.9007484747468394, 'f1-score': 0.888592436323023, 'support': 15899.0} | 0.8574 | {'precision': 0.8197783847474851, 'recall': 0.8259819424758111, 'f1-score': 0.8222339633230846, 'support': 33500.0} | {'precision': 0.8560915487238994, 'recall': 0.8573731343283582, 'f1-score': 0.8563883474835222, 'support': 33500.0} |
| 0.2701 | 15.0 | 615 | 0.6462 | {'precision': 0.6603214890016921, 'recall': 0.6319838056680162, 'f1-score': 0.6458419528340918, 'support': 4940.0} | {'precision': 0.8342832091188075, 'recall': 0.8697440585009141, 'f1-score': 0.8516446632356232, 'support': 2188.0} | {'precision': 0.9237646134197859, 'recall': 0.8978325217225246, 'f1-score': 0.9106139841177611, 'support': 10473.0} | {'precision': 0.8785556645414418, 'recall': 0.9013774451223348, 'f1-score': 0.8898202477414549, 'support': 15899.0} | 0.8585 | {'precision': 0.8242312440204318, 'recall': 0.8252344577534474, 'f1-score': 0.8244802119822328, 'support': 33500.0} | {'precision': 0.8576162126600033, 'recall': 0.8584776119402985, 'f1-score': 0.8578498550646764, 'support': 33500.0} |
| 0.2701 | 16.0 | 656 | 0.6472 | {'precision': 0.6572622779519331, 'recall': 0.6366396761133604, 'f1-score': 0.6467866323907456, 'support': 4940.0} | {'precision': 0.8274678111587983, 'recall': 0.8811700182815356, 'f1-score': 0.8534749889331562, 'support': 2188.0} | {'precision': 0.9268028016178357, 'recall': 0.8970686527260575, 'f1-score': 0.9116933527413877, 'support': 10473.0} | {'precision': 0.8801698670605613, 'recall': 0.8994905339958488, 'f1-score': 0.8897253242915357, 'support': 15899.0} | 0.8588 | {'precision': 0.8229256894472821, 'recall': 0.8285922202792007, 'f1-score': 0.8254200745892063, 'support': 33500.0} | {'precision': 0.8584358710936555, 'recall': 0.858776119402985, 'f1-score': 0.8584010941482899, 'support': 33500.0} |

### Framework versions
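
The new metadata pins the Accuracy metric to the `train[60%:80%]` slice of the `simple` config of essays_su_g, and the card describes a token-classification fine-tune of allenai/longformer-base-4096 with the labels Claim, MajorClaim, O and Premise. A rough sketch of how that slice and a checkpoint like this one could be loaded; the repo ids and the `tokens` column name are assumptions, not taken from this page:

```python
# Sketch: load the evaluation slice named in the metadata and run the fine-tuned
# checkpoint on one essay. Repo ids and the "tokens" column are assumptions.
import torch
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForTokenClassification

eval_split = load_dataset("Theoreticallyhugo/essays_su_g", "simple", split="train[60%:80%]")

model_id = "Theoreticallyhugo/<model-repo>"  # hypothetical
# add_prefix_space=True is required for RoBERTa/Longformer tokenizers on pre-split words
tokenizer = AutoTokenizer.from_pretrained(model_id, add_prefix_space=True)
model = AutoModelForTokenClassification.from_pretrained(model_id)

words = eval_split[0]["tokens"]  # assumed column holding pre-split words
enc = tokenizer(words, is_split_into_words=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**enc).logits  # shape: (1, seq_len, num_labels)

pred_ids = logits.argmax(dim=-1)[0].tolist()
print([model.config.id2label[i] for i in pred_ids][:20])  # e.g. 'O', 'Premise', 'Claim', ...
```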
meta_data/README_s42_e16.md
CHANGED

Before:

@@ -17,12 +17,12 @@ model-index:
 name: essays_su_g
 type: essays_su_g
 config: simple
-split: train[
 args: simple
 metrics:
 - name: Accuracy
 type: accuracy
-value: 0.
 ---

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You

@@ -32,14 +32,14 @@ should probably proofread and complete it, then remove this comment. -->

 This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Claim: {'precision': 0.
-- Majorclaim: {'precision': 0.
-- O: {'precision': 0.
-- Premise: {'precision': 0.
-- Accuracy: 0.
-- Macro avg: {'precision': 0.
-- Weighted avg: {'precision': 0.

 ## Model description

@@ -68,24 +68,24 @@ The following hyperparameters were used during training:

 ### Training results

-| Training Loss | Epoch | Step | Validation Loss | Claim
-|
-| No log | 1.0 | 41 | 0.
-| No log | 2.0 | 82 | 0.
-| No log | 3.0 | 123 | 0.
-| No log | 4.0 | 164 | 0.
-| No log | 5.0 | 205 | 0.
-| No log | 6.0 | 246 | 0.
-| No log | 7.0 | 287 | 0.
-| No log | 8.0 | 328 | 0.
-| No log | 9.0 | 369 | 0.
-| No log | 10.0 | 410 | 0.
-| No log | 11.0 | 451 | 0.
-| No log | 12.0 | 492 | 0.
-| 0.
-| 0.
-| 0.
-| 0.

 ### Framework versions

After:

@@ -17,12 +17,12 @@ model-index:
name: essays_su_g
type: essays_su_g
config: simple
split: train[60%:80%]
args: simple
metrics:
- name: Accuracy
type: accuracy
value: 0.858776119402985
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You

@@ -32,14 +32,14 @@ should probably proofread and complete it, then remove this comment. -->

This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6472
- Claim: {'precision': 0.6572622779519331, 'recall': 0.6366396761133604, 'f1-score': 0.6467866323907456, 'support': 4940.0}
- Majorclaim: {'precision': 0.8274678111587983, 'recall': 0.8811700182815356, 'f1-score': 0.8534749889331562, 'support': 2188.0}
- O: {'precision': 0.9268028016178357, 'recall': 0.8970686527260575, 'f1-score': 0.9116933527413877, 'support': 10473.0}
- Premise: {'precision': 0.8801698670605613, 'recall': 0.8994905339958488, 'f1-score': 0.8897253242915357, 'support': 15899.0}
- Accuracy: 0.8588
- Macro avg: {'precision': 0.8229256894472821, 'recall': 0.8285922202792007, 'f1-score': 0.8254200745892063, 'support': 33500.0}
- Weighted avg: {'precision': 0.8584358710936555, 'recall': 0.858776119402985, 'f1-score': 0.8584010941482899, 'support': 33500.0}

## Model description

@@ -68,24 +68,24 @@ The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Claim | Majorclaim | O | Premise | Accuracy | Macro avg | Weighted avg |
|:-------------:|:-----:|:----:|:---------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:--------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
| No log | 1.0 | 41 | 0.6237 | {'precision': 0.4813399941228328, 'recall': 0.33157894736842103, 'f1-score': 0.3926645091693635, 'support': 4940.0} | {'precision': 0.41758530183727033, 'recall': 0.7271480804387569, 'f1-score': 0.5305101700566855, 'support': 2188.0} | {'precision': 0.8614998552263295, 'recall': 0.8522868328081734, 'f1-score': 0.8568685802054334, 'support': 10473.0} | {'precision': 0.8528192892126083, 'recall': 0.8542675639977357, 'f1-score': 0.8535428122545169, 'support': 15899.0} | 0.7683 | {'precision': 0.6533111100997602, 'recall': 0.6913203561532717, 'f1-score': 0.6583965179214999, 'support': 33500.0} | {'precision': 0.7723271066974134, 'recall': 0.7682686567164179, 'f1-score': 0.7655218131315448, 'support': 33500.0} |
| No log | 2.0 | 82 | 0.4751 | {'precision': 0.5846230654018971, 'recall': 0.47408906882591095, 'f1-score': 0.5235859602056785, 'support': 4940.0} | {'precision': 0.7269767441860465, 'recall': 0.7143510054844607, 'f1-score': 0.7206085753803596, 'support': 2188.0} | {'precision': 0.9142337609859582, 'recall': 0.8641268022534135, 'f1-score': 0.8884743765953269, 'support': 10473.0} | {'precision': 0.8357695614789338, 'recall': 0.917038807472168, 'f1-score': 0.8745201535508637, 'support': 15899.0} | 0.8219 | {'precision': 0.7654007830132088, 'recall': 0.7424014210089883, 'f1-score': 0.7517972664330571, 'support': 33500.0} | {'precision': 0.816159208839521, 'recall': 0.8219402985074626, 'f1-score': 0.8170804260816812, 'support': 33500.0} |
| No log | 3.0 | 123 | 0.4586 | {'precision': 0.6658894070619586, 'recall': 0.4046558704453441, 'f1-score': 0.5033996474439688, 'support': 4940.0} | {'precision': 0.7872244714349977, 'recall': 0.79981718464351, 'f1-score': 0.7934708682838358, 'support': 2188.0} | {'precision': 0.9342819121711536, 'recall': 0.8714790413444095, 'f1-score': 0.9017883608339096, 'support': 10473.0} | {'precision': 0.8168702042580784, 'recall': 0.9508145166362665, 'f1-score': 0.8787676209853219, 'support': 15899.0} | 0.8356 | {'precision': 0.8010664987315472, 'recall': 0.7566916532673825, 'f1-score': 0.7693566243867591, 'support': 33500.0} | {'precision': 0.8293759599418965, 'recall': 0.8356119402985075, 'f1-score': 0.8250407291712659, 'support': 33500.0} |
| No log | 4.0 | 164 | 0.4525 | {'precision': 0.5575898801597869, 'recall': 0.6781376518218624, 'f1-score': 0.6119839240043845, 'support': 4940.0} | {'precision': 0.7466456195737964, 'recall': 0.8647166361974405, 'f1-score': 0.8013553578991952, 'support': 2188.0} | {'precision': 0.9201592832254853, 'recall': 0.8825551417931825, 'f1-score': 0.9009650063359004, 'support': 10473.0} | {'precision': 0.8922416683430564, 'recall': 0.836907981634065, 'f1-score': 0.8636894716344281, 'support': 15899.0} | 0.8296 | {'precision': 0.7791591128255312, 'recall': 0.8155793528616375, 'f1-score': 0.7944984399684771, 'support': 33500.0} | {'precision': 0.8421114352783158, 'recall': 0.8295820895522388, 'f1-score': 0.8341543739861718, 'support': 33500.0} |
| No log | 5.0 | 205 | 0.4721 | {'precision': 0.662877030162413, 'recall': 0.5783400809716599, 'f1-score': 0.6177297297297297, 'support': 4940.0} | {'precision': 0.7945205479452054, 'recall': 0.8747714808043876, 'f1-score': 0.8327169893408746, 'support': 2188.0} | {'precision': 0.9125229313507772, 'recall': 0.9024157357013273, 'f1-score': 0.9074411905904946, 'support': 10473.0} | {'precision': 0.8726254262055528, 'recall': 0.9014403421598842, 'f1-score': 0.8867988738669058, 'support': 15899.0} | 0.8524 | {'precision': 0.8106364839159872, 'recall': 0.8142419099093148, 'f1-score': 0.8111716958820011, 'support': 33500.0} | {'precision': 0.8490670984831405, 'recall': 0.8523582089552239, 'f1-score': 0.8500422842449816, 'support': 33500.0} |
| No log | 6.0 | 246 | 0.4792 | {'precision': 0.6428419936373276, 'recall': 0.6135627530364373, 'f1-score': 0.6278612118073537, 'support': 4940.0} | {'precision': 0.804950917626974, 'recall': 0.8619744058500914, 'f1-score': 0.83248730964467, 'support': 2188.0} | {'precision': 0.9285714285714286, 'recall': 0.8949680129857729, 'f1-score': 0.9114601059950406, 'support': 10473.0} | {'precision': 0.872155615365794, 'recall': 0.8967859613812189, 'f1-score': 0.8842993146649301, 'support': 15899.0} | 0.8522 | {'precision': 0.812129988800381, 'recall': 0.8168227833133802, 'f1-score': 0.8140269855279986, 'support': 33500.0} | {'precision': 0.8515881419840463, 'recall': 0.8521791044776119, 'f1-score': 0.8515914362320791, 'support': 33500.0} |
| No log | 7.0 | 287 | 0.5202 | {'precision': 0.6744186046511628, 'recall': 0.5342105263157895, 'f1-score': 0.5961820851688694, 'support': 4940.0} | {'precision': 0.8121475054229935, 'recall': 0.8555758683729433, 'f1-score': 0.8332962385933673, 'support': 2188.0} | {'precision': 0.9198786930150655, 'recall': 0.8978325217225246, 'f1-score': 0.9087219135056778, 'support': 10473.0} | {'precision': 0.8582063305978898, 'recall': 0.9208755267626895, 'f1-score': 0.8884371491853515, 'support': 15899.0} | 0.8524 | {'precision': 0.816162783421778, 'recall': 0.8021236107934867, 'f1-score': 0.8066593466133165, 'support': 33500.0} | {'precision': 0.8473766761482054, 'recall': 0.8523880597014926, 'f1-score': 0.8480805524125185, 'support': 33500.0} |
| No log | 8.0 | 328 | 0.5458 | {'precision': 0.6705622932745314, 'recall': 0.6155870445344129, 'f1-score': 0.6418997361477573, 'support': 4940.0} | {'precision': 0.8129251700680272, 'recall': 0.8738574040219378, 'f1-score': 0.8422907488986784, 'support': 2188.0} | {'precision': 0.9259259259259259, 'recall': 0.89277188962093, 'f1-score': 0.909046716251033, 'support': 10473.0} | {'precision': 0.8728428701180745, 'recall': 0.9066607962764954, 'f1-score': 0.8894304929968533, 'support': 15899.0} | 0.8573 | {'precision': 0.8205640648466398, 'recall': 0.822219283613444, 'f1-score': 0.8206669235735804, 'support': 33500.0} | {'precision': 0.8556957914959558, 'recall': 0.8572537313432835, 'f1-score': 0.8559826424660975, 'support': 33500.0} |
| No log | 9.0 | 369 | 0.5550 | {'precision': 0.6423661737138097, 'recall': 0.6242914979757085, 'f1-score': 0.6331998768093625, 'support': 4940.0} | {'precision': 0.8291592128801432, 'recall': 0.8473491773308958, 'f1-score': 0.8381555153707052, 'support': 2188.0} | {'precision': 0.909720885466795, 'recall': 0.9025112193258856, 'f1-score': 0.9061017111633034, 'support': 10473.0} | {'precision': 0.8796739874323399, 'recall': 0.8893012139128247, 'f1-score': 0.8844614037282621, 'support': 15899.0} | 0.8516 | {'precision': 0.8152300648732719, 'recall': 0.8158632771363286, 'f1-score': 0.8154796267679083, 'support': 33500.0} | {'precision': 0.8507741138987609, 'recall': 0.8516119402985075, 'f1-score': 0.8511506488942767, 'support': 33500.0} |
| No log | 10.0 | 410 | 0.5788 | {'precision': 0.6611198560827524, 'recall': 0.5951417004048583, 'f1-score': 0.6263982102908278, 'support': 4940.0} | {'precision': 0.8315460232350312, 'recall': 0.8505484460694699, 'f1-score': 0.8409399005874378, 'support': 2188.0} | {'precision': 0.9248446592366111, 'recall': 0.8953499474840065, 'f1-score': 0.9098583349505143, 'support': 10473.0} | {'precision': 0.8645358599184456, 'recall': 0.9067865903515945, 'f1-score': 0.8851573292402148, 'support': 15899.0} | 0.8536 | {'precision': 0.8205115996182102, 'recall': 0.8119566710774824, 'f1-score': 0.8155884437672487, 'support': 33500.0} | {'precision': 0.851239060922849, 'recall': 0.8535820895522388, 'f1-score': 0.8518342203238483, 'support': 33500.0} |
| No log | 11.0 | 451 | 0.5865 | {'precision': 0.661878453038674, 'recall': 0.6062753036437247, 'f1-score': 0.6328578975171685, 'support': 4940.0} | {'precision': 0.829535495179667, 'recall': 0.8651736745886655, 'f1-score': 0.8469798657718122, 'support': 2188.0} | {'precision': 0.9291244788564622, 'recall': 0.8937267258665139, 'f1-score': 0.9110819097678493, 'support': 10473.0} | {'precision': 0.8703893134364282, 'recall': 0.9098056481539719, 'f1-score': 0.88966111076942, 'support': 15899.0} | 0.8571 | {'precision': 0.8227319351278078, 'recall': 0.818745338063219, 'f1-score': 0.8201451959565625, 'support': 33500.0} | {'precision': 0.8553356293389153, 'recall': 0.8571044776119403, 'f1-score': 0.8557012776467233, 'support': 33500.0} |
| No log | 12.0 | 492 | 0.6140 | {'precision': 0.6268885064065787, 'recall': 0.6635627530364372, 'f1-score': 0.6447044940505456, 'support': 4940.0} | {'precision': 0.8325078793336335, 'recall': 0.8450639853747715, 'f1-score': 0.8387389430709912, 'support': 2188.0} | {'precision': 0.923546196989078, 'recall': 0.896209300105032, 'f1-score': 0.9096724171351037, 'support': 10473.0} | {'precision': 0.885440926543715, 'recall': 0.8847726272092584, 'f1-score': 0.885106650726735, 'support': 15899.0} | 0.8531 | {'precision': 0.8170958773182513, 'recall': 0.8224021664313748, 'f1-score': 0.8195556262458439, 'support': 33500.0} | {'precision': 0.8557695842930038, 'recall': 0.8531343283582089, 'f1-score': 0.8543077872420695, 'support': 33500.0} |
| 0.2701 | 13.0 | 533 | 0.6368 | {'precision': 0.6831773567678612, 'recall': 0.6058704453441296, 'f1-score': 0.642205771912885, 'support': 4940.0} | {'precision': 0.8174536256323778, 'recall': 0.8861974405850092, 'f1-score': 0.8504385964912281, 'support': 2188.0} | {'precision': 0.9274289099526066, 'recall': 0.8968776854769407, 'f1-score': 0.9118974807048201, 'support': 10473.0} | {'precision': 0.8733377459534268, 'recall': 0.912887602993899, 'f1-score': 0.892674826250077, 'support': 15899.0} | 0.8609 | {'precision': 0.8253494095765681, 'recall': 0.8254582935999946, 'f1-score': 0.8243041688397525, 'support': 33500.0} | {'precision': 0.8585565514078823, 'recall': 0.8608656716417911, 'f1-score': 0.8589909116520602, 'support': 33500.0} |
| 0.2701 | 14.0 | 574 | 0.6486 | {'precision': 0.6641386782231853, 'recall': 0.6204453441295547, 'f1-score': 0.641548927263213, 'support': 4940.0} | {'precision': 0.8142076502732241, 'recall': 0.8852833638025595, 'f1-score': 0.8482592511495511, 'support': 2188.0} | {'precision': 0.9240070782540307, 'recall': 0.897450587224291, 'f1-score': 0.9105352385565513, 'support': 10473.0} | {'precision': 0.8767601322395004, 'recall': 0.9007484747468394, 'f1-score': 0.888592436323023, 'support': 15899.0} | 0.8574 | {'precision': 0.8197783847474851, 'recall': 0.8259819424758111, 'f1-score': 0.8222339633230846, 'support': 33500.0} | {'precision': 0.8560915487238994, 'recall': 0.8573731343283582, 'f1-score': 0.8563883474835222, 'support': 33500.0} |
| 0.2701 | 15.0 | 615 | 0.6462 | {'precision': 0.6603214890016921, 'recall': 0.6319838056680162, 'f1-score': 0.6458419528340918, 'support': 4940.0} | {'precision': 0.8342832091188075, 'recall': 0.8697440585009141, 'f1-score': 0.8516446632356232, 'support': 2188.0} | {'precision': 0.9237646134197859, 'recall': 0.8978325217225246, 'f1-score': 0.9106139841177611, 'support': 10473.0} | {'precision': 0.8785556645414418, 'recall': 0.9013774451223348, 'f1-score': 0.8898202477414549, 'support': 15899.0} | 0.8585 | {'precision': 0.8242312440204318, 'recall': 0.8252344577534474, 'f1-score': 0.8244802119822328, 'support': 33500.0} | {'precision': 0.8576162126600033, 'recall': 0.8584776119402985, 'f1-score': 0.8578498550646764, 'support': 33500.0} |
| 0.2701 | 16.0 | 656 | 0.6472 | {'precision': 0.6572622779519331, 'recall': 0.6366396761133604, 'f1-score': 0.6467866323907456, 'support': 4940.0} | {'precision': 0.8274678111587983, 'recall': 0.8811700182815356, 'f1-score': 0.8534749889331562, 'support': 2188.0} | {'precision': 0.9268028016178357, 'recall': 0.8970686527260575, 'f1-score': 0.9116933527413877, 'support': 10473.0} | {'precision': 0.8801698670605613, 'recall': 0.8994905339958488, 'f1-score': 0.8897253242915357, 'support': 15899.0} | 0.8588 | {'precision': 0.8229256894472821, 'recall': 0.8285922202792007, 'f1-score': 0.8254200745892063, 'support': 33500.0} | {'precision': 0.8584358710936555, 'recall': 0.858776119402985, 'f1-score': 0.8584010941482899, 'support': 33500.0} |

### Framework versions
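
Both README diffs report the per-class scores as dictionaries with precision, recall, f1-score and support, plus accuracy, macro avg and weighted avg entries. That layout matches scikit-learn's `classification_report(..., output_dict=True)`, so the evaluation was presumably assembled with something along these lines (a sketch with toy labels, not the repository's actual metric code):

```python
# Sketch: classification_report(output_dict=True) produces exactly this structure --
# one dict per label plus "accuracy", "macro avg" and "weighted avg" entries.
from sklearn.metrics import classification_report

y_true = ["O", "O", "Premise", "Claim", "MajorClaim", "Premise"]  # toy token-level labels
y_pred = ["O", "Premise", "Premise", "Claim", "MajorClaim", "Premise"]

report = classification_report(y_true, y_pred, output_dict=True, zero_division=0)
print(report["Premise"])       # {'precision': ..., 'recall': ..., 'f1-score': ..., 'support': ...}
print(report["accuracy"])      # overall accuracy over all tokens
print(report["weighted avg"])  # support-weighted averages, as in the tables above
```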
meta_data/meta_s42_e16_cvi3.json
CHANGED

Before:

@@ -1 +1 @@
-{"Claim": {"precision": 0.

After:

{"Claim": {"precision": 0.6572622779519331, "recall": 0.6366396761133604, "f1-score": 0.6467866323907456, "support": 4940.0}, "MajorClaim": {"precision": 0.8274678111587983, "recall": 0.8811700182815356, "f1-score": 0.8534749889331562, "support": 2188.0}, "O": {"precision": 0.9268028016178357, "recall": 0.8970686527260575, "f1-score": 0.9116933527413877, "support": 10473.0}, "Premise": {"precision": 0.8801698670605613, "recall": 0.8994905339958488, "f1-score": 0.8897253242915357, "support": 15899.0}, "accuracy": 0.858776119402985, "macro avg": {"precision": 0.8229256894472821, "recall": 0.8285922202792007, "f1-score": 0.8254200745892063, "support": 33500.0}, "weighted avg": {"precision": 0.8584358710936555, "recall": 0.858776119402985, "f1-score": 0.8584010941482899, "support": 33500.0}}
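
The JSON holds the same final-epoch report in machine-readable form. Its macro and weighted averages follow directly from the four per-class entries, and because every token gets exactly one label, the support-weighted recall equals the overall accuracy (0.858776119402985 appears as both). A small consistency check, assuming the file has been downloaded to meta_data/meta_s42_e16_cvi3.json:

```python
# Recompute the aggregate entries of meta_s42_e16_cvi3.json from its per-class scores.
import json

with open("meta_data/meta_s42_e16_cvi3.json") as f:  # assumed local path
    report = json.load(f)

classes = ["Claim", "MajorClaim", "O", "Premise"]
total_support = sum(report[c]["support"] for c in classes)  # 33500.0

for metric in ("precision", "recall", "f1-score"):
    macro = sum(report[c][metric] for c in classes) / len(classes)
    weighted = sum(report[c][metric] * report[c]["support"] for c in classes) / total_support
    assert abs(macro - report["macro avg"][metric]) < 1e-9
    assert abs(weighted - report["weighted avg"][metric]) < 1e-9

# With one label per token, support-weighted recall is the overall accuracy.
assert abs(report["weighted avg"]["recall"] - report["accuracy"]) < 1e-9
print("aggregates consistent; accuracy =", report["accuracy"])
```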
model.safetensors
CHANGED

Before:

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
 size 592324828

After:

version https://git-lfs.github.com/spec/v1
oid sha256:b4d90434bb82816205ef8b7455aa659f1e0a01a756fdd8163729ccb4e8dad86f
size 592324828
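
model.safetensors is tracked with Git LFS, so the commit only rewrites the pointer file: the new weights are identified by their sha256 oid and byte size. A downloaded copy can be checked against this pointer as follows (the local path is an assumption):

```python
# Verify a downloaded model.safetensors against the LFS pointer in this commit.
import hashlib
import os

path = "model.safetensors"  # assumed local path
expected_sha256 = "b4d90434bb82816205ef8b7455aa659f1e0a01a756fdd8163729ccb4e8dad86f"
expected_size = 592324828

sha = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        sha.update(chunk)

print("size matches: ", os.path.getsize(path) == expected_size)
print("sha256 matches:", sha.hexdigest() == expected_sha256)
```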