# Model summary

bert-sci-am is a BERT-family model trained for argument mining in scientific literature. Under the hood it performs sequence classification. This version is fine-tuned for 3-class classification on [david-inf/am-nlp-abstrct](https://huggingface.co/datasets/david-inf/am-nlp-abstrct), a fork of the pie/abstrct dataset.

## How to use

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

def load_model():
    """Load the model and tokenizer from the Hugging Face Hub"""
    checkpoint = "david-inf/bert-sci-am"
    model = AutoModelForSequenceClassification.from_pretrained(
        checkpoint, num_labels=3)
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    return model, tokenizer

model, tokenizer = load_model()
```
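
Once loaded, the model works like any Transformers sequence classifier. The snippet below is a minimal inference sketch: the example sentence is made up, and the label names are read from `model.config.id2label`, which may fall back to generic `LABEL_0`..`LABEL_2` names if the checkpoint does not define them.

```python
import torch

# A hypothetical sentence from a scientific abstract
text = "The treatment group showed a significant reduction in symptoms."

# Tokenize and run a forward pass without tracking gradients
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Pick the highest-scoring class and map it to its label name
pred_id = logits.argmax(dim=-1).item()
print(model.config.id2label[pred_id])
```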
The model has ~109M parameters, stored as F32 tensors in safetensors format.
