---
license: apache-2.0
language:
  - en
metrics:
  - precision
  - recall
  - f1
base_model:
  - microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract-fulltext
pipeline_tag: text-classification
library_name: transformers
---

# Fine-tuned RE Model for DiMB-RE

## Model Description

This is a fine-tuned Relation Extraction (RE) model based on microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract-fulltext, designed as a sentence-level classifier that extracts relations between entities in the diet, human metabolism, and microbiome domain. The model has been trained on the DiMB-RE dataset and is optimized to infer relations across 13 relation types.
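Since the card declares `pipeline_tag: text-classification` and `library_name: transformers`, the model can presumably be loaded with the standard `AutoModelForSequenceClassification` API. The sketch below is a minimal, hypothetical usage example: the repository ID and the entity-marking convention in the input sentence are assumptions, not taken from this card.

```python
# Minimal usage sketch. The repo ID below is a placeholder -- substitute the
# actual Hugging Face repository name for this model.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "gbhong/DiMB-RE-relation-extraction"  # hypothetical repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Example sentence containing a candidate entity pair; the exact entity-marker
# convention depends on how the DiMB-RE training data was preprocessed.
text = "Inulin supplementation increased the abundance of Bifidobacterium."

inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its relation-type label.
predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])
```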

## Performance

The model has been evaluated on the DiMB-RE dataset using the following metrics (a generic sketch of how these scores are computed appears after the list):

- Relation with Factuality (w/ GOLD relations) - P: 0.926, R: 0.843, F1: 0.883
- Relation with Factuality (Strict, end-to-end w/ predicted entities and relations) - P: 0.399, R: 0.322, F1: 0.356
- Relation with Factuality (Relaxed, end-to-end w/ predicted entities and relations) - P: 0.440, R: 0.355, F1: 0.393
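
The precision, recall, and F1 figures above follow the standard definitions. The snippet below is a generic illustration (not the DiMB-RE evaluation script) of how these values are derived from true-positive, false-positive, and false-negative counts; the matching criterion (e.g. strict vs. relaxed matching of predicted entities) is what distinguishes the rows above.

```python
# Illustrative only: precision, recall, and F1 from TP/FP/FN counts.
def precision_recall_f1(tp: int, fp: int, fn: int) -> tuple[float, float, float]:
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return precision, recall, f1

# Hypothetical counts, only to show the call; not actual DiMB-RE results.
print(precision_recall_f1(tp=322, fp=485, fn=678))
```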

## Citation

If you use this model, please cite the paper below:

```bibtex
@misc{hong2024dimbreminingscientificliterature,
      title={DiMB-RE: Mining the Scientific Literature for Diet-Microbiome Associations},
      author={Gibong Hong and Veronica Hindle and Nadine M. Veasley and Hannah D. Holscher and Halil Kilicoglu},
      year={2024},
      eprint={2409.19581},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2409.19581},
}
```