How to use StanfordAIMI/RadBERT with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="StanfordAIMI/RadBERT")

# Or load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("StanfordAIMI/RadBERT")
model = AutoModelForMaskedLM.from_pretrained("StanfordAIMI/RadBERT")
```

RadBERT was continuously pre-trained on radiology reports from a BioBERT initialization.
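As an illustrative sketch of the fill-mask pipeline above (the example sentence is an assumption for demonstration, not from the model card), you can query the model with a masked radiology-style sentence and inspect its top predictions:

```python
from transformers import pipeline

# Load the fill-mask pipeline for RadBERT
pipe = pipeline("fill-mask", model="StanfordAIMI/RadBERT")

# Use the tokenizer's own mask token (e.g. [MASK] for BERT-style vocabularies)
masked = f"The chest radiograph shows no evidence of {pipe.tokenizer.mask_token}."

# Print the three highest-scoring fillers for the masked position
for pred in pipe(masked, top_k=3):
    print(pred["token_str"], round(pred["score"], 3))
```

Each prediction is a dict containing the filled token (`token_str`), its probability (`score`), and the completed sentence (`sequence`).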
```bibtex
@article{chambon_cook_langlotz_2022,
  title={Improved fine-tuning of in-domain transformer model for inferring COVID-19 presence in multi-institutional radiology reports},
  author={Chambon, Pierre and Cook, Tessa S. and Langlotz, Curtis P.},
  journal={Journal of Digital Imaging},
  year={2022},
  doi={10.1007/s10278-022-00714-8}
}
```