# Fine-tuned Flair Model on CO-Fun NER Dataset
This Flair model was fine-tuned on the CO-Fun NER dataset using German BERT as the backbone language model.
## Dataset
The Company Outsourcing in Fund Prospectuses (CO-Fun) dataset consists of 948 sentences with 5,969 named entity annotations, including 2,340 Outsourced Services, 2,024 Companies, 1,594 Locations and 11 Software annotations.
Overall, the following named entities are annotated:
- Auslagerung (engl. outsourcing)
- Unternehmen (engl. company)
- Ort (engl. location)
- Software
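These entity types can be predicted with the standard Flair API. The German example sentence below is purely illustrative and not taken from the dataset:

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Load the fine-tuned model from the Hugging Face model hub
tagger = SequenceTagger.load(
    "stefan-it/flair-co-funer-german_bert_base-bs8-e10-lr5e-05-2"
)

# Illustrative German sentence (not from the dataset)
sentence = Sentence(
    "Die Gesellschaft hat die Fondsbuchhaltung an die Beispielbank AG "
    "in Frankfurt am Main ausgelagert."
)

tagger.predict(sentence)

# Print all predicted NER spans with their labels
for span in sentence.get_spans("ner"):
    print(span)
```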
## Fine-Tuning
The latest Flair version is used for fine-tuning.
A hyper-parameter search over the following parameters with 5 different seeds per configuration is performed:
- Batch Sizes: [16, 8]
- Learning Rates: [3e-05, 5e-05]
More details can be found in this repository. All models are fine-tuned on a Hetzner GEX44 with an NVIDIA RTX 4000.
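A fine-tuning run for the best configuration (batch size 8, learning rate 5e-05, 10 epochs) could be sketched as follows. This is an assumption-laden sketch, not the exact training script: the corpus path and column format are hypothetical placeholders for the actual CO-Fun data loader.

```python
from flair.datasets import ColumnCorpus
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Hypothetical: CO-Fun corpus in CoNLL-style column format
corpus = ColumnCorpus("data/co-funer", {0: "text", 1: "ner"})
label_dict = corpus.make_label_dictionary(label_type="ner")

# German BERT as backbone, fine-tuned end-to-end
embeddings = TransformerWordEmbeddings(
    model="google-bert/bert-base-german-cased",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
    use_context=True,
)

# Plain linear tag head (no CRF, no RNN), as is typical for
# transformer fine-tuning in Flair
tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "resources/co-funer",
    learning_rate=5e-05,
    mini_batch_size=8,
    max_epochs=10,
)
```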
## Results
For each configuration, the micro F1-score on the development set is reported over the 5 seeds:
| Configuration    | Seed 1 | Seed 2     | Seed 3 | Seed 4 | Seed 5 | Average         |
|------------------|--------|------------|--------|--------|--------|-----------------|
| bs8-e10-lr5e-05  | 0.9346 | **0.9388** | 0.9301 | 0.9291 | 0.9346 | 0.9334 ± 0.0039 |
| bs16-e10-lr5e-05 | 0.9316 | 0.9328     | 0.9341 | 0.9315 | 0.9248 | 0.9310 ± 0.0036 |
| bs8-e10-lr3e-05  | 0.9234 | 0.9391     | 0.9207 | 0.9191 | 0.9394 | 0.9283 ± 0.0101 |
| bs16-e10-lr3e-05 | 0.9136 | 0.9269     | 0.9231 | 0.9251 | 0.9247 | 0.9227 ± 0.0053 |
The result in bold corresponds to the currently viewed model.
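The reported averages are the mean and sample standard deviation over the five seeds; for the best configuration they can be reproduced with a few lines of Python:

```python
from statistics import mean, stdev

# Development-set micro F1-scores for bs8-e10-lr5e-05 (5 seeds)
scores = [0.9346, 0.9388, 0.9301, 0.9291, 0.9346]

avg = mean(scores)
sd = stdev(scores)  # sample standard deviation (n - 1 denominator)
print(f"{avg:.4f} \u00b1 {sd:.4f}")  # → 0.9334 ± 0.0039
```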
Additionally, the Flair training log and TensorBoard logs are uploaded to the model hub.
Model: stefan-it/flair-co-funer-german_bert_base-bs8-e10-lr5e-05-2

Base model: google-bert/bert-base-german-cased