shenbinqian committed
Commit 858b69a
Parent: c79fde3

Update README.md

Files changed (1)
README.md +1 -1
README.md CHANGED
@@ -4,7 +4,7 @@ license: cc-by-sa-4.0
 # Flair-abbr-roberta-pubmed-plos-unfiltered
 
 This is a stacked model of embeddings from [roberta-large](https://huggingface.co/FacebookAI/roberta-large), [HunFlair pubmed models](https://github.com/flairNLP/flair/blob/master/resources/docs/HUNFLAIR.md) and [character-level language models trained on PLOS](https://github.com/shenbinqian/PLODv2-CLM4AbbrDetection/tree/main/clm), fine-tuning on the [PLODv2 unfiltered dataset](https://github.com/shenbinqian/PLODv2-CLM4AbbrDetection).
-It is released with our LREC-COLING 2024 publication (coming soon). It achieves the following results on the test set:
+It is released with our LREC-COLING 2024 publication [Using character-level models for efficient abbreviation and long-form detection](https://aclanthology.org/2024.lrec-main.270/). It achieves the following results on the test set:
 
 Results on abbreviations:
 - Precision: 0.8977
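
For reference, the embedding stack described in the README corresponds roughly to the following Flair setup. This is a minimal sketch, not the released checkpoint: the HunFlair pubmed character LM names are Flair's standard identifiers, and the PLOS character-LM paths are placeholders for the models trained in the linked repository.

```python
from flair.embeddings import FlairEmbeddings, StackedEmbeddings, TransformerWordEmbeddings

# Sketch of the embedding stack described above (placeholder paths for the
# PLOS character-level language models; the released model should be loaded
# from the Hub rather than rebuilt from scratch).
stacked_embeddings = StackedEmbeddings([
    TransformerWordEmbeddings("roberta-large"),   # contextual transformer embeddings
    FlairEmbeddings("pubmed-forward"),            # HunFlair pubmed character LM (forward)
    FlairEmbeddings("pubmed-backward"),           # HunFlair pubmed character LM (backward)
    FlairEmbeddings("path/to/plos-forward.pt"),   # placeholder: PLOS character LM (forward)
    FlairEmbeddings("path/to/plos-backward.pt"),  # placeholder: PLOS character LM (backward)
])
```

Stacking in this way gives the downstream sequence tagger both subword-level transformer features and character-level signals, which is the combination the abbreviation-detection fine-tuning on PLODv2 builds on.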