---
license: apache-2.0
base_model: allenai/longformer-base-4096
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: framing_classification_longformer_30_augmented_multi
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# framing_classification_longformer_30_augmented_multi

This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3585
- Accuracy: 0.6170
- F1: 0.1988
- Precision: 0.2058
- Recall: 0.2476
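
Since the card does not yet document intended uses, the following is only a minimal inference sketch. The Hub namespace (`your-username/...`) is a placeholder and the label names are whatever mapping was saved with the model; neither is specified in this card.

```python
# Minimal usage sketch, assuming the model is published as a standard
# sequence-classification head on top of Longformer. Replace the namespace
# with the actual repository path.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "your-username/framing_classification_longformer_30_augmented_multi"  # placeholder path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "Long news article to classify by framing ..."
# Longformer accepts inputs up to 4,096 tokens, so truncate at that limit.
inputs = tokenizer(text, truncation=True, max_length=4096, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label.get(predicted_id, predicted_id))
```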

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
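
For reference, a hedged sketch of how the hyperparameters above map onto a `transformers` `TrainingArguments` object (Transformers 4.32-era API). The data pipeline, number of labels, and `Trainer` wiring are not documented in this card, so only the optimization settings are reproduced.

```python
# Sketch of the TrainingArguments implied by the list above.
# Adam betas and epsilon are the library defaults, so they are not set explicitly.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="framing_classification_longformer_30_augmented_multi",
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="epoch",  # assumption: the results table reports one evaluation per epoch
)
```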

### Training results

| Training Loss | Epoch | Step   | Accuracy | F1     | Validation Loss | Precision | Recall |
|:-------------:|:-----:|:------:|:--------:|:------:|:---------------:|:---------:|:------:|
| 1.282         | 1.0   | 7043   | 0.5773   | 0.1691 | 1.5099          | 0.1473    | 0.2326 |
| 2.5135        | 2.0   | 14086  | 0.5455   | 0.1008 | 2.2716          | 0.0779    | 0.1429 |
| 2.384         | 3.0   | 21129  | 0.5455   | 0.1008 | 2.6195          | 0.0779    | 0.1429 |
| 2.5536        | 4.0   | 28172  | 0.5455   | 0.1008 | 2.3823          | 0.0779    | 0.1429 |
| 2.3549        | 5.0   | 35215  | 0.5455   | 0.1008 | 2.3964          | 0.0779    | 0.1429 |
| 2.3181        | 6.0   | 42258  | 0.5455   | 0.1008 | 2.4343          | 0.0779    | 0.1429 |
| 2.4398        | 7.0   | 49301  | 0.5455   | 0.1008 | 2.4609          | 0.0779    | 0.1429 |
| 2.3715        | 8.0   | 56344  | 0.5455   | 0.1008 | 2.4317          | 0.0779    | 0.1429 |
| 2.5554        | 9.0   | 63387  | 0.5455   | 0.1008 | 2.3966          | 0.0779    | 0.1429 |
| 1.3177        | 10.0  | 70430  | 0.5830   | 0.1707 | 1.4776          | 0.1472    | 0.2352 |
| 1.3928        | 11.0  | 77473  | 0.6159   | 0.1750 | 1.5114          | 0.1470    | 0.2348 |
| 1.5202        | 12.0  | 84516  | 0.6159   | 0.1746 | 1.4525          | 0.1465    | 0.2337 |
| 1.4013        | 13.0  | 91559  | 0.5909   | 0.1625 | 1.4524          | 0.1399    | 0.2113 |
| 1.4087        | 14.0  | 98602  | 0.5955   | 0.1736 | 1.4572          | 0.1484    | 0.2385 |
| 2.3755        | 15.0  | 105645 | 0.5727   | 0.1420 | 1.9328          | 0.1193    | 0.1771 |
| 2.2211        | 16.0  | 112688 | 0.5943   | 0.1596 | 1.7707          | 0.1317    | 0.2043 |
| 2.0359        | 17.0  | 119731 | 0.5830   | 0.1506 | 1.9399          | 0.1248    | 0.1900 |
| 1.7553        | 18.0  | 126774 | 0.5920   | 0.1580 | 1.8171          | 0.1306    | 0.2026 |
| 1.4321        | 19.0  | 133817 | 0.6125   | 0.1733 | 1.4162          | 0.1462    | 0.2317 |
| 1.4545        | 20.0  | 140860 | 0.6068   | 0.1728 | 1.4446          | 0.1466    | 0.2324 |
| 1.3939        | 21.0  | 147903 | 0.6148   | 0.1747 | 1.4451          | 0.1473    | 0.2345 |
| 1.4333        | 22.0  | 154946 | 0.5841   | 0.1702 | 1.4462          | 0.1474    | 0.2333 |
| 1.3013        | 23.0  | 161989 | 0.6170   | 0.1757 | 1.4099          | 0.1480    | 0.2363 |
| 1.397         | 24.0  | 169032 | 0.6170   | 0.1766 | 1.4181          | 0.1489    | 0.2385 |
| 1.4752        | 25.0  | 176075 | 0.6136   | 0.1727 | 1.3997          | 0.1444    | 0.2297 |
| 1.372         | 26.0  | 183118 | 0.6170   | 0.1748 | 1.4134          | 0.1471    | 0.2340 |
| 1.4563        | 27.0  | 190161 | 0.6205   | 0.1775 | 1.3920          | 0.1492    | 0.2394 |
| 1.3727        | 28.0  | 197204 | 0.6125   | 0.1737 | 1.3763          | 0.1465    | 0.2328 |
| 1.4587        | 29.0  | 204247 | 0.6170   | 0.1988 | 1.3585          | 0.2058    | 0.2476 |
| 1.2723        | 30.0  | 211290 | 0.6136   | 0.1973 | 1.3586          | 0.1967    | 0.2455 |
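
The table reports accuracy, F1, precision, and recall per epoch. Below is a hedged sketch of a `compute_metrics` callback that would produce values of this form; macro averaging is an assumption, since the card does not state the averaging mode or the number of classes.

```python
# Sketch of a metrics callback for the Trainer that yields accuracy, F1,
# precision, and recall. Macro averaging is an assumption.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, predictions, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, predictions),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```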


### Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.1
- Datasets 2.14.4
- Tokenizers 0.13.3