---
license: mit
base_model: lemon-mint/LaBSE-EnKo-Nano-Preview-v0.3
tags:
- generated_from_trainer
metrics:
- precision
- recall
- accuracy
model-index:
- name: ko-edu-classifier
  results: []
---


This is a training checkpoint. I strongly recommend not using this model 🤗

# ko-edu-classifier

This model is a fine-tuned version of [lemon-mint/LaBSE-EnKo-Nano-Preview-v0.3](https://huggingface.co/lemon-mint/LaBSE-EnKo-Nano-Preview-v0.3) on an unspecified dataset.
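
Assuming the checkpoint was exported with a standard sequence-classification head on top of the LaBSE encoder (which the precision/recall/accuracy metrics suggest), a minimal loading sketch might look like the following. The repo id and the example sentence are placeholders, not taken from this card.

```python
# Hedged sketch: load the checkpoint as a text classifier and run one example.
# Assumes a standard AutoModelForSequenceClassification export; the repo id and
# the Korean example sentence are placeholders, not part of this model card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "ko-edu-classifier"  # replace with the actual Hub repo id or a local path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("예시 문장입니다.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
print(pred, model.config.id2label.get(pred, pred))
```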

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 256
- eval_batch_size: 256
- seed: 0
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 30
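
For reference, a rough reconstruction of these settings as `TrainingArguments`. The Adam betas and epsilon listed above are the Transformers defaults, so they are not set explicitly; `output_dir`, the per-device interpretation of the batch sizes, and the evaluation strategy are assumptions rather than the author's actual script.

```python
# Hedged reconstruction of the listed hyperparameters as TrainingArguments.
# output_dir, eval strategy, and the per-device batch-size mapping are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ko-edu-classifier",   # assumed output path
    learning_rate=3e-4,               # 0.0003
    per_device_train_batch_size=256,  # assumed to match the reported train_batch_size
    per_device_eval_batch_size=256,   # assumed to match the reported eval_batch_size
    seed=0,
    lr_scheduler_type="cosine",
    warmup_ratio=0.2,
    num_train_epochs=30,
    eval_strategy="epoch",            # assumption; the card reports roughly per-epoch eval
)
```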

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Precision | Recall | F1 Macro | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:---------:|:------:|:--------:|:--------:|
| 8.2847        | 0.9922  | 128  | 5.7695          | 0.0554    | 0.1667 | 0.0832   | 0.3326   |
| 2.9466        | 1.9845  | 256  | 2.4992          | 0.0297    | 0.1667 | 0.0504   | 0.1783   |
| 2.2442        | 2.9767  | 384  | 2.2770          | 0.0972    | 0.1779 | 0.0789   | 0.1884   |
| 2.11          | 3.9690  | 512  | 2.2539          | 0.1370    | 0.1917 | 0.1233   | 0.1966   |
| 2.0444        | 4.9612  | 640  | 1.9768          | 0.2723    | 0.2069 | 0.1448   | 0.2171   |
| 2.0458        | 5.9535  | 768  | 2.1823          | 0.1460    | 0.2022 | 0.1450   | 0.2021   |
| 2.0249        | 6.9457  | 896  | 2.0237          | 0.2773    | 0.2019 | 0.1478   | 0.2062   |
| 2.0141        | 7.9380  | 1024 | 2.0108          | 0.3220    | 0.2043 | 0.1498   | 0.2081   |
| 2.0178        | 8.9302  | 1152 | 1.9606          | 0.2890    | 0.2066 | 0.1513   | 0.2127   |
| 2.0145        | 9.9225  | 1280 | 2.0984          | 0.3189    | 0.2077 | 0.1561   | 0.2062   |
| 2.0093        | 10.9147 | 1408 | 1.9506          | 0.2829    | 0.2089 | 0.1517   | 0.2157   |
| 2.014         | 11.9070 | 1536 | 1.9494          | 0.3039    | 0.2086 | 0.1538   | 0.2152   |
| 2.0137        | 12.8992 | 1664 | 1.9247          | 0.3109    | 0.2110 | 0.1548   | 0.2190   |
| 2.0055        | 13.8915 | 1792 | 1.8977          | 0.3184    | 0.2121 | 0.1537   | 0.2223   |
| 2.0058        | 14.8837 | 1920 | 1.9747          | 0.3245    | 0.2094 | 0.1539   | 0.2130   |
| 1.9975        | 15.8760 | 2048 | 1.9288          | 0.3084    | 0.2109 | 0.1535   | 0.2187   |
| 1.995         | 16.8682 | 2176 | 1.8964          | 0.3036    | 0.2142 | 0.1590   | 0.2247   |
| 1.9959        | 17.8605 | 2304 | 1.9247          | 0.3164    | 0.2144 | 0.1605   | 0.2209   |
| 2.003         | 18.8527 | 2432 | 1.9297          | 0.3152    | 0.2151 | 0.1595   | 0.2217   |
| 1.9908        | 19.8450 | 2560 | 1.8936          | 0.3065    | 0.2144 | 0.1610   | 0.2256   |
| 1.9843        | 20.8372 | 2688 | 1.9238          | 0.3201    | 0.2168 | 0.1613   | 0.2242   |
| 2.0042        | 21.8295 | 2816 | 1.9712          | 0.3228    | 0.2095 | 0.1577   | 0.2119   |
| 1.9913        | 22.8217 | 2944 | 1.9070          | 0.3134    | 0.2168 | 0.1612   | 0.2250   |
| 1.9855        | 23.8140 | 3072 | 1.9155          | 0.3123    | 0.2166 | 0.1611   | 0.2242   |
| 1.9892        | 24.8062 | 3200 | 1.9338          | 0.3213    | 0.2163 | 0.1619   | 0.2220   |
| 1.9964        | 25.7984 | 3328 | 1.9309          | 0.3125    | 0.2167 | 0.1625   | 0.2226   |
| 1.9704        | 26.7907 | 3456 | 1.9165          | 0.3101    | 0.2187 | 0.1648   | 0.2258   |
| 1.9977        | 27.7829 | 3584 | 1.9165          | 0.3177    | 0.2193 | 0.1653   | 0.2264   |
| 1.9976        | 28.7752 | 3712 | 1.9127          | 0.3099    | 0.2191 | 0.1643   | 0.2269   |
| 1.9728        | 29.7674 | 3840 | 1.9129          | 0.3096    | 0.2186 | 0.1640   | 0.2264   |


### Framework versions

- Transformers 4.43.3
- Pytorch 2.4.0+cu118
- Datasets 2.19.1
- Tokenizers 0.19.1