# BART Clinical Notes Summarizer (Fine-tuned on MIMIC-IV)

This is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) for abstractive summarization of clinical notes, trained on the MIMIC-IV dataset.
## Model Purpose

The model is designed to generate concise, informative summaries of lengthy medical notes such as discharge summaries, progress notes, and radiology reports.
## How to Use

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model = AutoModelForSeq2SeqLM.from_pretrained("brandonhcheung04/bart")
tokenizer = AutoTokenizer.from_pretrained("brandonhcheung04/bart")

text = "Patient is a 75-year-old male with a history of congestive heart failure admitted for shortness of breath..."

# Tokenize the note, truncating to BART's 1024-token context window
inputs = tokenizer(text, return_tensors="pt", max_length=1024, truncation=True)

# Generate a summary with beam search
summary_ids = model.generate(inputs["input_ids"], max_length=150, num_beams=4, early_stopping=True)

print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```
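
If you prefer a one-liner, the same checkpoint can also be loaded through the `transformers` summarization `pipeline`. The sketch below is a minimal example; the generation parameters shown are illustrative defaults, not values prescribed by this model card.

```python
from transformers import pipeline

# Summarization pipeline wrapping the same checkpoint
# (max_length and num_beams here are illustrative assumptions)
summarizer = pipeline("summarization", model="brandonhcheung04/bart")

note = "Patient is a 75-year-old male with a history of congestive heart failure admitted for shortness of breath..."
result = summarizer(note, max_length=150, num_beams=4, truncation=True)
print(result[0]["summary_text"])
```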