BART Clinical Notes Summarizer (Fine-tuned on MIMIC-IV)

This is a fine-tuned version of facebook/bart-base for abstractive summarization of clinical notes, trained on the MIMIC-IV dataset.

🧠 Model Purpose

The model is designed to generate concise and informative summaries of lengthy medical notes, such as discharge summaries, progress notes, or radiology reports.

📦 How to Use

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model = AutoModelForSeq2SeqLM.from_pretrained("brandonhcheung04/bart")
tokenizer = AutoTokenizer.from_pretrained("brandonhcheung04/bart")

text = "Patient is a 75-year-old male with a history of congestive heart failure admitted for shortness of breath..."
inputs = tokenizer(text, return_tensors="pt", max_length=1024, truncation=True)

# Pass the attention mask along with the input IDs so padded tokens are ignored.
summary_ids = model.generate(**inputs, max_length=150, num_beams=4, early_stopping=True)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```
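Clinical notes such as discharge summaries frequently exceed the 1024-token input limit, and the call above silently truncates anything past it. One common workaround is to summarize overlapping windows of the note and then concatenate (or re-summarize) the partial summaries. The helper below is a minimal sketch of that sliding-window split over a token-ID list; `chunk_token_ids` is a hypothetical name, not part of the model or the `transformers` API:

```python
def chunk_token_ids(ids, max_len=1024, stride=896):
    """Split a list of token IDs into overlapping windows of at most max_len.

    Consecutive windows overlap by (max_len - stride) tokens so that no
    sentence is cut without context. This is an illustrative sketch, not
    part of the transformers library.
    """
    if len(ids) <= max_len:
        return [ids]
    chunks = []
    start = 0
    while start < len(ids):
        chunks.append(ids[start:start + max_len])
        if start + max_len >= len(ids):
            break  # the final window reaches the end of the note
        start += stride
    return chunks
```

Each chunk can then be decoded back to text (or wrapped in a tensor directly) and passed through `model.generate` as shown above, with the per-chunk summaries joined afterwards.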
Model size: 406M params · Tensor type: F32 · Format: Safetensors