Part of the Argument Mining collection: BERT models for argument component detection.
`bert-sci-am` is a BERT-family model trained for argument mining on scientific literature. At a low level, it performs sequence classification. This version is trained for 3-class classification on [david-inf/am-nlp-abstrct](https://huggingface.co/datasets/david-inf/am-nlp-abstrct), a fork of the pie/abstrct dataset.
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer


def load_model():
    """Load the model and tokenizer from the Hugging Face Hub."""
    checkpoint = "david-inf/bert-sci-am"
    model = AutoModelForSequenceClassification.from_pretrained(
        checkpoint, num_labels=3)
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    return model, tokenizer


model, tokenizer = load_model()
```
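
Once loaded, the model can classify a single sentence into one of the three argument component classes. The sketch below is illustrative rather than part of the original card: the `classify` helper and the example sentence are assumptions, and it relies on the checkpoint's `id2label` mapping (if the config does not define one, generic `LABEL_<i>` names are returned).

```python
import torch


def classify(text, model, tokenizer):
    """Predict the argument component class of a single sentence."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    pred = logits.argmax(dim=-1).item()
    # id2label comes from the checkpoint config; defaults to LABEL_<i> if unset
    return model.config.id2label[pred]


# Hypothetical abstract sentence, for illustration only
print(classify(
    "Treatment with the study drug significantly reduced tumor size.",
    model, tokenizer))
```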